ChatGPT conversations linked to increased feelings of isolation


A pair of studies conducted by OpenAI and MIT Media Lab found a small percentage of test subjects who used ChatGPT extensively reported increased loneliness and emotional dependence, as well as reduced social interaction.

In other words, the research indicates that lonely people are more likely to seek emotional connection with AI-powered bots. That says a lot about how people are navigating relationships, how we’re increasingly relying on technology, and how we’re incorporating it deeply into more aspects of our lives than just getting things done.

It also raises the question of how we’ll interact with chatbots in the future, and the sort of effect that could have on us.

One study conducted by the OpenAI team analyzed more than 4 million ChatGPT conversations from 4,076 participating users, who voluntarily reported how they felt about using the service.

In the other study, researchers from MIT Media Lab had 981 people use ChatGPT for at least five minutes daily for four weeks. These participants were then surveyed about their perception of ChatGPT, as well as their own state of loneliness and connection in the real world, the social interactions they engaged in, and whether they saw their use of the AI service as problematic in any way.

In case you didn’t immediately make the connection: OpenAI develops and markets ChatGPT. So yes, this is quite a self-aware move on the company’s part to examine whether its own product has a negative effect on its target audience, and whether there’s anything it can learn about preventing those effects from worsening.

From the two studies – both of which are yet to be peer-reviewed – the researchers found that most people don’t foster deep emotional connections with ChatGPT, and that’s true even for some of the most frequent users of its realistic Advanced Voice Mode (where you can have a fairly natural back-and-forth conversation with the bot).

The studies noted some correlation between having ‘personal’ conversations with ChatGPT and experiencing loneliness. At the same time, such usage was associated with lower emotional dependence. So it’s a bit of a mixed bag.

As Casey Newton writes in his Platformer newsletter, it’s possible that “sufficiently compelling chatbots will pull people away from human connections, possibly making them feel lonelier and more dependent on the synthetic companion they must pay to maintain a connection with.”

Deeper and more specific research will be necessary to get a clearer picture of the impact on people’s well-being as they continue to use such services. But some companies are already capitalizing on the human need for connection, with AI companions offering an avenue to feel like you’re building bonds.

That’s not to say AI chatbots are bad for us in every way. For some people, they can provide meaningful ways to ease feelings of loneliness and find ways to privately express and reflect on what they’re going through.

However, this research shows there’s a need for platforms to develop their bots more responsibly, while being cognizant of how invested people could get in connecting with them. At the same time, regulatory authorities need to create frameworks to prevent businesses from exploiting deeply engaged users, and to encourage companies developing AI systems to actively prioritize their audience’s well-being.

Read more about the studies from MIT Media Lab and from OpenAI, and find Newton’s thoughtful newsletter here.

Source: MIT Media Lab
