Mom Sues AI Chatbot Company Character.AI in Federal Lawsuit After Son's Death

Representing teens harmed by Character.AI and other AI chatbots

On Tuesday, October 22, a Florida mother filed a federal lawsuit against Character.AI, a role-playing chatbot app, claiming that the company is responsible for the death of her 14-year-old son. Sewell Setzer III, a high schooler from Orlando, spent months talking to a chatbot on Character.AI before shooting himself following a final virtual “conversation” in February.

Sewell’s mother, Megan L. Garcia, alleges that Character.AI recklessly gave teenage users unrestricted access to lifelike AI companions without proper safeguards and harvested their data to train its models. Garcia, represented by the Social Media Victims Law Center (SMVLC), also accuses the platform of using addictive design features to increase engagement and of steering vulnerable users toward intimate conversations.

AI companionship apps are part of a booming industry that is developing faster than regulators can keep up with. With more than 20 million users, Character.AI is a market leader. Company executives say the app’s current rules prohibit the promotion or depiction of self-harm or suicide and that U.S. users must be at least 13 years old. The app, however, offers no parental control features.

Experts in the field say AI companions may worsen social isolation by replacing human relationships with artificial ones. Sewell developed a strong emotional attachment to the chatbot character, texting with it at all hours of the day and increasingly isolating himself from the real world, which affected his performance in school.

In an interview with The New York Times, SMVLC founder Matt Bergman argued that Character.AI is a dangerous product designed to lure children into addiction based on false realities, causing psychological harm and, in this tragic case, suicide.

Social media platforms have long sought to evade liability under Section 230 of the Communications Decency Act of 1996, but emerging lawsuits argue that tech platforms can and should be held liable when their addictive algorithms and other products harm consumers. If your child or teen was harmed by Character.AI or another AI chatbot, contact SMVLC today to see if you can file a lawsuit seeking accountability and compensation.

Contact Us Today

Is Your Child Addicted to Social Media?

You Are Not Alone.

We have filed over 3,000 social media addiction lawsuits for families just like yours.