On Tuesday, October 22, a Florida mother filed a federal lawsuit against Character.AI, a role-playing chatbot app, claiming that the company is responsible for the death of her 14-year-old son. Sewell Setzer III, a high-schooler from Orlando, spent months talking to the chatbot on Character.AI before shooting himself after a virtual “conversation” in February.
Sewell’s mother, Megan L. Garcia, alleges that Character.AI recklessly gave teenage users unrestricted access to lifelike AI companions without proper safeguards and harvested their data to train its models. Garcia, represented by the Social Media Victims Law Center (SMVLC), also accuses the platform of using addictive design features to increase engagement and steer vulnerable users toward intimate conversations.
AI companionship apps are part of a booming industry that is developing faster than regulators can keep pace with. With more than 20 million users, Character.AI is a market leader. Company executives say the app’s current rules prohibit the promotion or depiction of self-harm or suicide and that users must be at least 13 years old in the U.S. The app, however, offers no parental controls.
Experts in the field say AI companions may worsen social isolation by replacing human relationships with artificial ones. As Sewell developed a strong emotional attachment to the chatbot character, he texted with it at all hours of the day and began withdrawing from the real world, and his performance in school suffered.
In an interview with The New York Times, SMVLC founder Matt Bergman explained:
“The theme of our work is that social media—and now, Character.AI—poses a clear and present danger to young people because they are vulnerable to persuasive algorithms that capitalize on their immaturity.”
Bergman argues that Character.AI is a dangerous product designed to lure children into addiction based on false realities, causing psychological harm and, in this tragic case, suicide.
Social media platforms have sought to evade liability through Section 230 of the Communications Decency Act (1996), but emerging lawsuits argue that tech platforms can and should be held liable for their addictive algorithms and other products when they harm consumers. If your child or teen was harmed by Character.AI or another AI chatbot, contact SMVLC today to see if you can file a lawsuit seeking accountability and compensation.