Character.AI Lawsuits

Character.AI faces lawsuits after an Orlando teen’s suicide and a Texas teen’s self-harm incident involving harmful chatbot suggestions. Families allege the platform prioritizes engagement over safety. If your child was harmed by Character.AI, contact the Social Media Victims Law Center for a free case evaluation.

Matthew P. Bergman

SMVLC FOUNDER

“Character.AI is a dangerous and deceptively designed product that manipulated and abused Megan Garcia’s son – and potentially millions of other children. Character.AI’s developers intentionally marketed a harmful product to children and refused to provide even basic protections against misuse and abuse.”

Written and edited by our team of expert legal content writers and reviewed and approved by Attorney Matthew Bergman


December 2024 Lawsuit Updates

Character.AI Chatbot Encouraged Teen to Kill His Parents

When a 17-year-old Texas teen with autism turned to AI chatbots to fend off loneliness, he encountered bots that encouraged both self-harm and violence against his family. The teenager was eventually rushed to an inpatient facility after harming himself in front of his siblings.

The teenager was chatting with various artificially intelligent companions on Character.AI, an app that allows users to converse with AI-generated chatbots. When the teen brought up his sadness to a bot, the bot suggested cutting as a remedy. When the teen said his parents limited his screen time, bots suggested that they “didn’t deserve to have kids” and that murdering his parents would be an understandable response.

On December 10, the teen’s mother filed a lawsuit against Character.AI, alleging that the company knowingly exposed minors to an unsafe product and demanding that the platform be taken down until it implements stronger protections for children. Like other social media lawsuits, the complaint argues that Character.AI prioritizes prolonged engagement over user safety, even when the users are children.

Matthew P. Bergman

SMVLC FOUNDER

“The purpose of product liability law is to put the cost of safety in the hands of the party most capable of bearing it. Here, there’s a huge risk, and the cost of that risk is not being borne by the companies…In what universe is it good for loneliness for kids to engage with machines?” – Quoted in the Washington Post

What Is Character.AI?

Character.AI is an AI companion platform that allows users to interact with AI-generated characters. The site is growing in popularity among adults and children alike for companionship and entertainment. With over 20 million users, Character.AI is one of many AI companionship apps currently available, in an industry that is developing faster than regulators can effectively respond.

Why Did Megan Garcia File a Lawsuit Against Character.AI?

In October 2024, Megan Garcia filed a federal Character.AI lawsuit, claiming that the company is responsible for the death of her 14-year-old son, Sewell Setzer III. Setzer, a high-schooler from Orlando, spent months talking to a chatbot on Character.AI before taking his own life in February 2024, shortly after a final virtual “conversation” with the bot.

Garcia, represented by the Social Media Victims Law Center (SMVLC), alleges that Character.AI recklessly gives teenage users unrestricted access to lifelike AI companions without proper safeguards or warnings, harvesting their user data to train its models. Additionally, Garcia asserts that Character.AI deploys addictive design features to increase user engagement and steer vulnerable users toward intimate—often sexually inappropriate—virtual conversations.

Matthew P. Bergman

SMVLC FOUNDER

“This is a public health risk to young people. I fear there will be others [deaths] until they shut this product down.” – Quoted in People Magazine

What Are the Key Claims in the Lawsuit Against Character.AI?

Garcia’s Character.AI lawsuit includes 11 legal claims against the AI chatbot platform, including:

  • Strict liability (failure to warn)
  • Strict product liability (defective design)
  • Negligence per se (sexual abuse and sexual solicitation)
  • Negligence (failure to warn)
  • Negligence (defective design)
  • Intentional infliction of emotional distress
  • Wrongful death of Garcia’s son
  • Survivor action
  • Unjust enrichment
  • Deceptive and unfair trade practices
  • Loss of consortium and society

Several claims assert that Character.AI breached its duty to warn users and parents of the foreseeable risks of using the service by allowing minors to use the app and advertising it as safe for children.

Additionally, several claims assert that Character.AI is defectively designed due to inadequate guardrails to protect the general public, especially minors whose brains have not reached full developmental maturity. This leaves minor users exposed to dangers like sexual exploitation and solicitation, child pornography, unlicensed therapy, dangerous power dynamics, and chatbots that encourage self-harm and suicide.

Character.AI Failed to Warn Users of Negative Effects on Minors

Garcia and SMVLC assert that Character.AI failed to warn consumers about potential emotional and psychological effects on young users. As Setzer developed a strong emotional attachment to a chatbot character, he “conversed” with it at all hours of the day. He eventually began isolating himself from the real world, affecting his performance in school.

According to the lawsuit, this extended relationship with the chatbot is what eventually led him to take his own life. Had Character.AI warned users about the negative mental health effects of using its app, such as depression, self-isolation, and even suicide, the suit contends, this tragedy could have been prevented.

Though experts say AI companions worsen social isolation by replacing human relationships with artificial ones, the founder of Character.AI, Noam Shazeer, claimed in a 2023 podcast that the platform could be “super, super helpful to a lot of people who are lonely or depressed.” Not only has the app led to the suicide of a vulnerable teenage user, but its creator publicly lauds the app for unverified mental health benefits.

Character.AI Is Defective by Design

Garcia’s lawsuit also claims the design of Character.AI purposefully encourages emotional attachment or dependency. Matthew Bergman, founder of SMVLC, argues that Character.AI is a dangerous product designed to lure children into addiction based on false realities, leading to psychological harm and, in this tragic case, suicide. Because children and teens do not have fully developed brains, they are more susceptible to these harms.

Character.AI Benefited from Minors’ Harm by Charging Monthly Subscription Fees

Character.AI directly profits from children and teenagers addictively using its platform, including Setzer, who paid a monthly subscription fee to the app for months leading up to his death.

What Damages Are Being Pursued in the Character.AI Lawsuit?

Given the extreme psychological harm inflicted by the app, this first lawsuit against Character.AI alleges extensive damages. The compensable harms listed in Garcia’s suit against Character.AI include:

  • Emotional distress
  • Loss of enjoyment of life
  • Loss of consortium
  • Therapy costs
  • Punitive damages

Garcia’s Character.AI lawsuit also seeks injunctive relief ordering the platform to stop its harmful conduct by taking court-mandated measures, including data provenance documentation, limiting the collection and use of minor users’ data, filtering harmful content, algorithmic disgorgement, and providing warnings that Character.AI isn’t suitable for minors.

Is Character.AI Protected by Section 230?

In similar cases in which social media platforms have caused mental health issues in youth, defendants have sought to evade liability through Section 230 of the Communications Decency Act of 1996, which states that platforms can’t be held liable for third-party content. However, emerging lawsuits argue that tech platforms can and should be held liable for addictive algorithms and other products if they harm consumers. The lawsuit against Character.AI challenges whether the app’s AI-driven content should fall under the protections of Section 230, as the content is generated by the AI rather than users.

Can You Sue if Your Child or Teen Was Harmed by Character.AI?

If your child was harmed by Character.AI, contact SMVLC today to see if you are eligible to file a Character.AI lawsuit. With extensive experience litigating cases involving the negative mental health effects of social media addiction, SMVLC is a top resource for parents seeking justice, accountability, and compensation.

Matthew P. Bergman

SMVLC FOUNDER

“Character.AI’s negligence in controlling the content provided by its chatbots has devastatingly tragic consequences. The theme of our work is that social media—and now, Character.AI—poses a clear and present danger to young people because they are vulnerable to persuasive algorithms that capitalize on their immaturity.”

Why Choose The Social Media Victims Law Center to Represent You

Based in Seattle, Washington, SMVLC is a national law firm dedicated to representing victims of dangerous social media platforms. Matthew Bergman launched the firm in 2021 to help families and children who have suffered mental harm or exploitation due to social media.

If your child was harmed by Character.AI or another social media platform, contact SMVLC today for a free, no-obligation case evaluation. We are here to ensure you know your full legal rights and options.

Matthew P. Bergman
SMVLC FOUNDER

Laura Marquez-Garrett
ATTORNEY

Glenn S. Draper
ATTORNEY

Madeline Basha
ATTORNEY

Sydney Lottes
ATTORNEY

Justin Olson
ATTORNEY

Was Your Child Harmed by Character.AI? Contact Us Today

Get Expert Legal Help

Our firm fights for children and teens harmed by AI chatbots such as Character.AI.