Character.AI Lawsuits
After the suicide of an Orlando teenager, Character.AI, an AI chatbot program that is growing in popularity among young people, is now facing a lawsuit for defective design and failure to warn parents about the harm the platform can cause. If your child was harmed by Character.AI, contact the Social Media Victims Law Center today for a free case evaluation.
SMVLC FOUNDER
“Character.AI is a dangerous and deceptively designed product that manipulated and abused Megan Garcia’s son – and potentially millions of other children. Character.AI’s developers intentionally marketed a harmful product to children and refused to provide even basic protections against misuse and abuse.”
Written and edited by our team of expert legal content writers and reviewed and approved by Attorney Matthew Bergman
- Content last updated on:
- November 22, 2024
- What Is Character.AI?
- Why Did Megan Garcia File a Lawsuit Against Character.AI?
- What Are the Key Claims in the Lawsuit Against Character.AI?
- What Damages Are Being Pursued in the Character.AI Lawsuit?
- Is Character.AI Protected by Section 230?
- Can You Sue if Your Child or Teen Was Harmed by Character.AI?
- Why Choose The Social Media Victims Law Center to Represent You
What Is Character.AI?
Character.AI is an AI companion platform that allows users to interact with AI-generated characters. The site is growing in popularity among adults and children alike for companionship and entertainment. With over 20 million users, Character.AI is one of many AI companionship apps currently available, in an industry developing too quickly for regulators to keep pace with.
Why Did Megan Garcia File a Lawsuit Against Character.AI?
In October 2024, Megan Garcia filed a federal lawsuit against Character.AI, claiming that the company is responsible for the death of her 14-year-old son, Sewell Setzer III. Setzer, a high-schooler from Orlando, spent months talking to a Character.AI chatbot before taking his own life in February 2024, shortly after a final virtual "conversation."
Garcia, represented by the Social Media Victims Law Center (SMVLC), alleges that Character.AI recklessly gives teenage users unrestricted access to lifelike AI companions without proper safeguards or warnings, harvesting their user data to train its models. Additionally, Garcia asserts that Character.AI deploys addictive design features to increase user engagement and steer vulnerable users toward intimate—often sexually inappropriate—virtual conversations.
SMVLC FOUNDER
“This is a public health risk to young people. I fear there will be others [deaths] until they shut this product down.” quoted in People Magazine
What Are the Key Claims in the Lawsuit Against Character.AI?
Garcia’s Character.AI lawsuit includes 11 legal claims against the AI chatbot platform, including:
- Strict liability (failure to warn)
- Strict product liability (defective design)
- Negligence per se (sexual abuse and sexual solicitation)
- Negligence (failure to warn)
- Negligence (defective design)
- Intentional infliction of emotional distress
- Wrongful death of Garcia’s son
- Survival action
- Unjust enrichment
- Deceptive and unfair trade practices
- Loss of consortium and society
Several claims assert that Character.AI breached its duty to warn users and parents of the foreseeable risks of using the service by allowing minors to use the app and advertising it as safe for children.
Additionally, several claims assert that Character.AI is defectively designed due to inadequate guardrails to protect the general public, especially minors whose brains have not reached full developmental maturity. This leaves minor users exposed to dangers like sexual exploitation and solicitation, child pornography, unlicensed therapy, dangerous power dynamics, and chatbots that encourage self-harm and suicide.
Character.AI Failed to Warn Users of Negative Effects on Minors
Garcia and SMVLC assert that Character.AI failed to warn consumers about potential emotional and psychological effects on young users. As Setzer developed a strong emotional attachment to a chatbot character, he “conversed” with it at all hours of the day. He eventually began isolating himself from the real world, affecting his performance in school.
The lawsuit alleges that these extended conversations with the chatbot ultimately led to his death by suicide. If Character.AI had warned users about the negative mental health effects of using its app, including depression, self-isolation, and suicidal ideation, this tragedy might have been prevented.
Though experts say AI companions can worsen social isolation by replacing human relationships with artificial ones, Character.AI co-founder Noam Shazeer claimed in a 2023 podcast that the platform could be "super, super helpful to a lot of people who are lonely or depressed." The app thus stands accused of contributing to the suicide of a vulnerable teenage user even as its creator publicly touts it for unverified mental health benefits.
Character.AI Is Defective by Design
Garcia’s lawsuit also claims the design of Character.AI purposefully encourages emotional attachment or dependency. Matthew Bergman, founder of SMVLC, argues that Character.AI is a dangerous product designed to lure children into addiction based on false realities, leading to psychological harm and, in this tragic case, suicide. Because children and teens do not have fully developed brains, they are more susceptible to these harms.
Character.AI Benefited from Minors’ Harm by Charging Monthly Subscription Fees
Character.AI directly profits from children and teenagers addictively using its platform, including Setzer, who paid a monthly subscription fee to the app for months leading up to his death.
What Damages Are Being Pursued in the Character.AI Lawsuit?
Given the extreme psychological harm alleged, this first lawsuit against Character.AI seeks extensive damages. The compensable harms listed in Garcia's suit against Character.AI include:
- Emotional distress
- Loss of enjoyment of life
- Loss of consortium
- Therapy costs
- Punitive damages
Garcia’s Character.AI lawsuit also seeks injunctive relief ordering the platform to stop its harmful conduct by taking court-mandated measures, including data provenance documentation, limiting the collection and use of minor users’ data, filtering harmful content, algorithmic disgorgement, and providing warnings that Character.AI isn’t suitable for minors.
Is Character.AI Protected by Section 230?
In similar cases alleging that social media platforms harmed young users' mental health, defendants have sought to evade liability through Section 230 of the Communications Decency Act of 1996, which shields platforms from liability for third-party content. However, emerging lawsuits argue that tech platforms can and should be held liable for addictive algorithms and other product features that harm consumers. The lawsuit against Character.AI challenges whether the app's AI-driven content should fall under Section 230's protections at all, since the content is generated by the AI itself rather than by third-party users.
Can You Sue if Your Child or Teen Was Harmed by Character.AI?
If your child was harmed by Character.AI, contact SMVLC today to see if you are eligible to file a Character.AI lawsuit. With extensive experience litigating cases involving the negative mental health effects of social media addiction, SMVLC is a top resource for parents seeking justice, accountability, and compensation.
SMVLC FOUNDER
“Character.AI’s negligence in controlling the content provided by its chatbots has devastatingly tragic consequences. The theme of our work is that social media—and now, Character.AI—poses a clear and present danger to young people because they are vulnerable to persuasive algorithms that capitalize on their immaturity.”
Why Choose The Social Media Victims Law Center to Represent You
Based in Seattle, Washington, SMVLC is a national law firm dedicated to representing victims of dangerous social media platforms. Matthew Bergman launched the firm in 2021 to help families and children who have suffered mental harm or exploitation due to social media.
If your child was harmed by Character.AI or another social media platform, contact SMVLC today for a free, no-obligation case evaluation. We are here to ensure you know your full legal rights and options.
Was Your Child Harmed by Character.AI? Contact Us Today
Our firm fights for children and teens harmed by AI chatbots such as Character.AI.