March 31, 2025

If you use ChatGPT a lot, this study has some concerning findings for you

Although generative AI only started permeating the online world a couple of years ago, billions of people are already using it. If you’re reading this, there’s a good chance you use ChatGPT for quick questions, emails, or creative brainstorming. But over the past two years, as the chatbot gained features like a human-like voice and memory, researchers have noticed more and more people treating it less like a tool and more like a companion.

In a joint study, researchers from MIT and OpenAI tackled an uncomfortable question: does spending time with a highly conversational AI make people emotionally attached, or even addicted?

So, you think ChatGPT is your friend?

The research combined two approaches. First, the team analyzed over 40 million real-world ChatGPT interactions and surveyed over 4,000 users. In parallel, nearly 1,000 participants took part in a tightly controlled randomized controlled trial (RCT), using ChatGPT daily for 28 days under various experimental conditions.

Across both studies, the researchers found that a small percentage of users accounted for a disproportionate share of “affective use”: conversations marked by emotional content, intimacy sharing, and signs of dependency. To identify these conversations, the team ran an automated analysis that used classifiers to flag emotional indicators, although they concede these classifiers can lack nuance. They also tracked how often users displayed these emotional cues over time.
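To make the idea of “flagging emotional indicators” concrete, here is a minimal, purely illustrative sketch in Python. The cue lexicon, function names, and scoring are invented for illustration; the study’s actual classifiers are far more sophisticated and are not reproduced here.

```python
# Illustrative sketch only: a toy "affective cue" flagger. This is NOT
# the classifier used in the MIT/OpenAI study; the cue lists below are
# hypothetical examples.
from collections import Counter

AFFECTIVE_CUES = {
    "loneliness": ["lonely", "alone", "no one to talk to"],
    "dependency": ["i need you", "can't cope without", "only you understand"],
    "intimacy": ["i love you", "you're my best friend", "i missed you"],
}

def flag_affective_cues(message: str) -> Counter:
    """Count which categories of emotional cues appear in one message."""
    text = message.lower()
    hits = Counter()
    for category, phrases in AFFECTIVE_CUES.items():
        if any(phrase in text for phrase in phrases):
            hits[category] += 1
    return hits

def affective_share(conversation: list[str]) -> float:
    """Fraction of messages in a conversation that contain any flagged cue."""
    if not conversation:
        return 0.0
    flagged = sum(1 for msg in conversation if flag_affective_cues(msg))
    return flagged / len(conversation)

# Example: one task-oriented message, one emotionally loaded one.
chat = ["Summarize this report for me.", "Honestly, you're my best friend."]
print(affective_share(chat))  # 0.5
```

A real pipeline would use trained language-model classifiers rather than keyword lists, which is partly why the researchers caution that such automated labels can miss nuance.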

If you’re wondering whether your regular use of ChatGPT means you’re on the slippery slope to AI addiction, you probably shouldn’t worry — most users aren’t showing signs of trouble. The majority engaged in neutral, task-oriented conversations. They saw ChatGPT as a helpful assistant, not a shoulder to cry on.

“Even among heavy users, high degrees of affective use are limited to a small group,” the researchers write in a release accompanying the study. These were the users most likely to agree with the statement “I consider ChatGPT to be a friend.”

Who gravitates towards emotional use

The researchers also examined how people use ChatGPT’s voice feature. One might assume that the voice makes the chatbot more “addictive,” but the picture is more complicated.

In fact, users of voice mode (especially the more engaging version) reported better emotional well-being when controlling for usage time. They were less lonely, less emotionally dependent, and less prone to problematic use than text-only users. But when usage time increased significantly, even voice-mode users began reporting worse outcomes.

This suggests a self-selection effect. People seeking emotional connection might naturally gravitate to voice chat, where responses feel more personal. But the technology itself isn’t inherently harmful. It’s the intensity of the engagement — and the person’s baseline mental state — that tip the scales.

In the big picture, people who start out lonelier seem most likely to turn to AI for companionship. They are also more likely to develop what psychologists call a “parasocial relationship”: a one-sided emotional bond with a media figure, or in this case, an AI. Like parasocial bonds with influencers or fictional characters, these relationships can sometimes provide comfort, but they can also blur the line between reality and simulation.

Not quite addiction

It’s not exactly addiction, but the researchers call it “problematic use,” borrowing the term from behavioral psychology and digital media research. Users who engaged emotionally with ChatGPT showed less social interaction with other people, higher emotional dependence, and increased feelings of loneliness, especially among those who were lonely to begin with.

Are we headed towards a world where we start to consider algorithms our friends, or will we implement some helpful guardrails?

As always seems to be the case with AI, the challenge is huge. AI is getting more natural, more accessible, and more embedded in daily life. As it learns to mirror your tone, remember your preferences, and speak with human warmth, the temptation to lean on it emotionally will grow. So will the risk of crossing a line — from using a tool to needing a friend.

In the meantime, it may be worth asking yourself a simple question when you use AI: am I doing this to get things done, or to feel less alone?

You can read the entire report here.