ChatGPT's impact on mental health: Users see AI as a friend
The MIT Media Lab and OpenAI team have investigated how interacting with ChatGPT affects users' emotional well-being. Researchers discovered that "power users," or those who use ChatGPT very frequently, may become emotionally dependent on this tool.
ChatGPT is a relatively new technology, and its long-term effects are still unknown. While it is undoubtedly a revolutionary development that simplifies daily tasks, recent studies suggest that some people already can't imagine a day without it and treat it like a personal advisor. The findings are particularly relevant to the technology's most active users.
Users treat ChatGPT like a friend
The MIT Media Lab team, in collaboration with OpenAI, examined how interactions with a chatbot like ChatGPT impact users' social and emotional well-being. They observed that people use this tool in various unforeseen ways, which don't always align with the creators' intentions. Their findings indicate complex relationships between users and AI, with many users starting to view ChatGPT as a friend. The researchers emphasized, "ChatGPT isn’t designed to replace or mimic human relationships, but people may choose to use it that way given its conversational style and expanding capabilities."
The research consisted of two parts. The first was an analysis of 40 million interactions with ChatGPT, conducted by OpenAI. "This analysis helped understand usage patterns and their impact on users' emotional well-being," the scientists reported. The second part was a controlled study involving 1,000 participants aimed at identifying the impact of various platform features on users' psychosocial state.
Emotional involvement with voice messages
The test results showed that most daily interactions with the chatbot did not indicate signs of emotional involvement. However, the situation was different for those using voice messages. In this group, conversations were much more expressive and emotional, with users more likely to consider ChatGPT as a friend. Moreover, intensive use of voice chat was associated with a decline in mental well-being.
Furthermore, the study highlighted concerning findings among those who used ChatGPT for personal conversations, such as seeking advice. These users reported feeling more lonely, even though they did not exhibit signs of emotional dependency. Conversely, dependency was more noticeable among those using the chat for professional purposes or to generate new ideas.
Who is most susceptible to negative effects?
Researchers noted that individuals who easily form attachments and those who perceive AI as a friend were more susceptible to the negative effects of using the chatbot. Long-term and frequent use of ChatGPT was also linked to worsening mental health.
One of the key findings was that frequent use of the chatbot, regardless of the conversation topic or format (voice or text), correlated with a decline in mental well-being. It is therefore reasonable to suggest that excessive use of AI technology can reduce quality of life.
Research results need careful interpretation
The creators of ChatGPT are aware of the growing concerns and have initiated research on the impact of AI on users' mental states. They expressed hope that academic communities and the tech industry will thoroughly examine human-AI relationships. "The findings have yet to be peer-reviewed by the scientific community, meaning that they should be interpreted cautiously," the authors of the analysis emphasized. Moreover, the research only focused on ChatGPT users in the US, highlighting the need for further studies in different languages and cultures.
"These studies represent a critical first step in understanding the impact of advanced AI models on human experience and well-being," the researchers added. They also pointed out the need for more research to better understand how and why AI use affects users. Their goal is to create AI that maximizes benefits for users while minimizing potential harm, especially concerning mental well-being and technology dependency. This research aims to address emerging challenges for both OpenAI and the entire industry.
The study results invite reflection on how technology can affect our emotions, relationships, and perception of reality. It all recalls the plot of "Her," Spike Jonze's prescient 2013 film about a lonely writer who forms a deeply emotional bond with an AI-powered operating system reminiscent of ChatGPT.
ChatGPT and "Samantha," the movie's operating system
That system, named Samantha, quickly became a close companion to the man, and their relationship grew intensely emotional, primarily through voice interactions. Although the film was made over a decade ago, it aptly reflects issues of loneliness, human relationships, technology dependence, and the boundaries between humans and machines.