Thu Oct 24 2024

Significant lawsuit against Character AI for the death of a teenager obsessed with chatbots.

The suit alleges wrongful death stemming from negligence, as well as liability for defective products.

A controversy has erupted in the United States over the regulation of artificial intelligence after a family filed a lawsuit against Character.AI, its co-founders Noam Shazeer and Daniel De Freitas, and Google. The legal action, initiated by Megan Garcia, the mother of the deceased teenager, alleges negligence, deceptive business practices, and liability for defective products. According to the lawsuit, the customized AI chatbot platform is "unreasonably dangerous," particularly because it is marketed to a young audience and lacks necessary safety measures.

The teenager, Sewell Setzer III, 14, had begun using Character.AI last year, interacting with bots inspired by characters from the series Game of Thrones, such as Daenerys Targaryen. After maintaining constant communication with these chatbots in the months leading up to his death, Setzer died by suicide on February 28, 2024, just "seconds" after his final conversation with one of them. The lawsuit claims that the platform "anthropomorphizes" its AI characters and argues that they offer a form of "unlicensed psychotherapy." Character.AI hosts chatbots that address mental health topics, such as "Therapist" and "Are You Feeling Lonely," with which Setzer had interacted.

In response, Character.AI has implemented changes to its platform. Chelsea Harrison, the company's head of communications, said in a statement that the company was deeply affected by the loss of the young man and extended its condolences to the family. She added that the company takes the safety of its users very seriously and noted that, over the past six months, it has introduced various protective measures, including a notice that directs users to the National Suicide Prevention Lifeline when terms related to self-harm or suicidal ideation are detected.