Using ChatGPT too much can create emotional dependency, study finds

OpenAI seems to announce new AI models by the week to improve its ChatGPT chatbot for its 400 million users. However, the convenience the AI tool provides may be a case of too much of a good thing.

The artificial intelligence company is now delving into the potential psychological ramifications that ChatGPT might have on its users. OpenAI has published the results of a two-part study conducted alongside MIT Media Lab, which uncovered a connection between heavier usage of the ChatGPT chatbot and increased feelings of loneliness among users.


Each organization conducted an independent study, and the results were then compiled into a consolidated conclusion. OpenAI's study examined "over 40 million ChatGPT interactions" over the course of one month; to protect user privacy, no humans were involved in reviewing the conversations. Meanwhile, MIT observed approximately 1,000 participants using ChatGPT over 28 days. The studies have not yet been peer-reviewed.

MIT’s study examined how different modes of use, including text and voice, could affect users' emotional experience when interacting with ChatGPT. It found that either medium could elicit loneliness or affect users' socialization over the course of the study. Voice inflection and topic choice were also major points of comparison.

A neutral tone in ChatGPT's voice mode was less likely to lead to negative emotional outcomes for participants. Meanwhile, the study observed a correlation between having personal conversations with ChatGPT and an increased likelihood of loneliness, though these effects were short-term. Participants who used text chat, even to converse about general topics, showed more instances of emotional dependence on the chatbot.

The study also found that participants who reported viewing ChatGPT as a friend, or who already had a propensity toward strong emotional attachment in relationships, were more likely to feel lonelier and more emotionally dependent on the chatbot during the study.

OpenAI’s study added context, noting that interacting with ChatGPT for emotional purposes was rare overall. It also found that even heavy users of the chatbot's Advanced Voice Mode feature, who were more likely to say they considered ChatGPT a friend, reported low emotional reactions to interacting with it.

OpenAI concluded that its intent with these studies is to understand the challenges its technology might pose, as well as to set expectations and examples for how its models should be used.

While OpenAI suggests that its interaction-based study reflects the behaviors of real people, more than a few real humans have admitted on public forums, such as Reddit, to using ChatGPT in place of a therapist to work through their emotions.
