Using ChatGPT too much can create emotional dependency, study finds

OpenAI seems to announce new AI models by the week to improve its ChatGPT chatbot for its 400 million users. However, the convenience the AI tool provides may be a case of too much of a good thing.

The artificial intelligence company is now delving into the potential psychological ramifications that ChatGPT might have on its users. OpenAI has published the results of a two-part study completed alongside MIT Media Lab, which uncovered a connection between increased usage of the ChatGPT chatbot and users’ increased feelings of loneliness.


Each organization conducted an independent study, then compiled the results into a consolidated conclusion. OpenAI's study examined "over 40 million ChatGPT interactions" over the course of one month, without human review of the content in order to maintain user privacy. Meanwhile, MIT observed approximately 1,000 participants using ChatGPT over 28 days. Neither study has yet been peer-reviewed.

MIT's study delved into how different modes of use, including text and voice, could affect users' emotional experience of interacting with ChatGPT. Results found that both mediums had the potential to elicit loneliness or to affect users' socialization during the time of the study. Voice inflection and topic choice were also major points of comparison.

A neutral tone used in ChatGPT's voice mode was less likely to lead to negative emotional outcomes for participants. Meanwhile, the study observed a correlation between participants having personal conversations with ChatGPT and an increased likelihood of loneliness, though these effects were short-term. Even those using text chat to converse about general topics experienced increased emotional dependence on the chatbot.

The study also observed that those who reported viewing ChatGPT as a friend, and those who already had a propensity toward strong emotional attachment in relationships, were more likely to feel lonelier and more emotionally dependent on the chatbot while participating in the study.

OpenAI's study added further context, noting that interacting with ChatGPT for emotional purposes was rare overall. Additionally, the study found that even among heavy users of the chatbot's Advanced Voice Mode feature, who were more likely to say they considered ChatGPT a friend, emotional reactions to interacting with the chatbot were low.

OpenAI concluded that its intent with these studies is to understand the challenges that might arise due to its technology, as well as to be able to set expectations and examples for how its models should be used.

While OpenAI suggests that its interaction-based study reflects the behaviors of real people, more than a few real humans have admitted on public forums, such as Reddit, to using ChatGPT in place of seeing a therapist about their emotions.
