AI Chatbot Lawsuit Clears Key Legal Hurdle in Landmark Case

A federal judge ruled on May 21 that a lawsuit against Character.AI over a teen user’s suicide can proceed. While Judge Anne Conway dismissed one of the suit’s claims, she denied the defendants’ motion to dismiss the rest. The lawsuit, filed by the teen’s mother, could reshape how the law views AI platform accountability.

Key Ruling Points

  • First Amendment Defenses: The judge rejected the defendants’ First Amendment arguments, stating that it’s too early to rule whether AI chatbot output qualifies as protected speech.
  • Product Liability Claims: Because the court ruled that Character.AI may be treated as a “product,” the plaintiff’s product liability arguments may now proceed.
  • Emotional Distress Claims: The judge granted the defendants’ motion to dismiss the plaintiff’s claim of intentional infliction of emotional distress, finding that none of the mother’s allegations involve the type of “outrageous conduct” necessary to support the claim.

Case Background

In February 2024, 14-year-old Sewell Setzer III died by suicide after forming an emotional bond with an AI chatbot on Character.AI, an app that allows users to interact with chatbots. His mother, Megan Garcia, sued Character.AI and its founders over her son’s death, alleging product liability, intentional infliction of emotional distress, unjust enrichment, wrongful death, and more. She is represented by attorney Matthew Bergman, founder of the Social Media Victims Law Center (SMVLC), which is spearheading the legal fight.

Matthew P. Bergman

SMVLC FOUNDER

“It was a period of abuse extending for 8 months in which he was subjected to highly inappropriate sexual content and learned and encouraged to develop a relationship with this character and so you have to look at it in its totality and see the just completely devastating effect it had on his mental health.” – Quoted on Fox35 Orlando

Why It Matters

Garcia’s case is one of the first major lawsuits to allege that AI interactions can directly contribute to emotional harm or death, especially among children. The judge’s ruling is a key early win for families and advocates pushing for stronger tech platform safety standards. In the future, the court’s decision on whether an LLM’s output counts as protected speech under the First Amendment could have major implications for similar cases. SMVLC’s mission is to hold AI and social media companies accountable for the psychological risks they pose to minors.

Matthew P. Bergman

SMVLC FOUNDER

“Communication from a chatbot is not a human expression,” said Bergman. “It’s not speech. It is a machine talking. One could no more say that Character.ai has a right to free expression as one could say a cat does or a dog does, or a talking robot.” – Quoted on Fox35 Orlando
