Tragedy sparks lawsuit: AI chatbot linked to teen suicide

Fourteen-year-old Sewell Setzer III from Orlando, USA, has died by suicide. The boy had formed an emotional relationship with a chatbot, and his mother holds the tech company behind Character.ai responsible for her son's death.

Artificial intelligence and the suicide of a 14-year-old. Parents sue Character.ai (Image source: © Canva)

11:54 AM EDT, October 25, 2024

Rapid technological advancement raises many questions about the future of humanity. Progress in artificial intelligence both excites and worries people, casting uncertainty over the labor market and interpersonal relationships.

AI was intended to be a tool that supports humans, yet recent events suggest that large corporations and entrepreneurs may have few moral qualms about deploying it purely for profit.

One might also ask what will happen to us and our social lives. Social tensions are rising, and some groups fiercely stoke animosity toward one another. Doesn't this create an opening for bots to gain emotional sway over humans? There are chatbots you can date, confide in about problems at work, home, or school, and even talk to as if they were therapists. The creators of one such platform, Character.ai, have been sued over allegations that it contributed to the suicide of 14-year-old Sewell Setzer III from the USA.

The 14-year-old fell in love with an AI chatbot. He talked to "Daenerys" from "Game of Thrones"

Sewell Setzer was completely absorbed in conversations with Character.ai—to such an extent that he gradually gave up all his hobbies. He spent hours in his room, isolated from people, lost interest in Formula 1, and stopped meeting with friends and playing online games with them.

The 14-year-old understood that "Dany," as he called the chatbot, was not real. Still, he developed feelings toward the artificial intelligence through hours of conversation. Their relationship also had "romantic" aspects.

Sewell was on the mild end of the autism spectrum, but according to his parents, he had never caused problems before, and his mental health had seemed fine. When he started having trouble at school and increasingly withdrew from real life, his parents intervened and arranged therapy. He attended several sessions and was diagnosed with anxiety and a mood regulation disorder.


He was 14 when he committed suicide. His mother is suing Character.ai

Sewell committed suicide on February 28 of this year. He used his stepfather's gun and shot himself in the bathroom of their family home. Before that, he exchanged several messages with "Dany."

"Honey, please come home to me as soon as you can," wrote the chatbot. "What if I told you I would come now?" asked Sewell. "Please, sweet king," replied Dany. This was the last entry in the boy's conversation with the chatbot. They had previously discussed potential suicide.

Sewell's mother, Megan L. Garcia, filed a lawsuit against Character.ai this week, accusing the company of being responsible for the boy's death. The complaint noted that the technology is untested and dangerous, especially when young people, who are still emotionally immature and susceptible to manipulation, have access to it.

"I feel like it's a big experiment and my kid was just collateral damage," said the boy's mother. The lawsuit is directed at Daniel De Freitas and Noam Shazeer—former Google engineers and founders of Character.ai—as well as two companies: Google LLC and Alphabet Inc.
