
Media Contact:

Jason Ysais
Consumer Attorney Public Relations
P: (818) 330-8878
E: jasony@camginc.com

Social Media Victims Law Center files first lawsuit against Roblox and Discord

Tween girl attempts suicide after sexual exploitation and social media addiction 

SAN FRANCISCO – October 5, 2022 – The Social Media Victims Law Center (SMVLC), a legal resource for parents of children harmed by social media addiction and abuse, today announced that it has filed its first lawsuit against Roblox and Discord for enabling and facilitating the exploitation of a young girl from Long Beach, Calif., by adult men. The lawsuit also alleges that Snapchat and Meta, the parent company of Instagram, caused the girl, referred to as “S.U.” in court documents, to become addicted to their products, leading to severe mental health issues and suicide attempts, and that Snapchat enabled, facilitated, and profited from her abuse.

The lawsuit alleges that S.U. became acquainted with adult men through Roblox’s and Discord’s direct messaging services, which S.U. thought had safeguards to protect her. These men sexually and financially exploited her. They also introduced her to the social media platforms Instagram and Snapchat, to which she became addicted.

S.U.’s exploitation and addiction led to multiple suicide attempts and self-harm, driving her parents deep into medical debt and causing her to withdraw from school. Her mother was forced to quit her job to constantly monitor S.U. and prevent further suicide attempts. The lawsuit seeks to hold Roblox, Discord, Meta, and Snapchat responsible for their inherently dangerous and defective product designs, their failures to warn, and their failures to act on the known harms their products were causing children like S.U.

“These companies have failed to put in place basic, easily implemented safeguards that could have kept S.U. and other children like her safe,” said Matthew P. Bergman, founding attorney of SMVLC. “I commend S.U. and her mother for their bravery in sharing their story. It truly demonstrates that these products can harm even the most loving, attentive families. I believe this suit will help bring them justice, and I hope that it leads these companies to improve their products.”

C.U. and S.U. v. Meta Platforms, Inc.; Snap, Inc.; Roblox Corporation; and Discord Inc.
Filed in the Superior Court of California, County of San Francisco

When S.U. was only 9 or 10, her mother allowed her to begin using Roblox, an online game platform targeted at and popular with children under the age of 13. Believing that the Roblox product had appropriate safeguards, her mother set firm rules about screen time and use, which she believed her daughter was obeying. And, at first, she was.

In early 2020, an adult named Charles befriended S.U. on Roblox, sending her direct messages and encouraging her to move their conversation to the Discord platform. S.U. was hesitant at first, but Discord offered a “Keep me safe” setting which promised that, if selected, Discord would monitor S.U.’s activities and protect her from harm. S.U. selected “Keep me safe” and, believing that Discord would keep her safe, decided not to tell her mother about her Discord account. Charles immediately began exploiting S.U., eventually introducing her through Discord to several of his adult friends, who also abused, exploited, and manipulated her. Charles encouraged her to drink alcohol and take prescription drugs. Discord’s “Keep me safe” setting did not prevent S.U. from receiving harmful material, and Discord, which requires users to be at least 13, did not verify her age or obtain parental consent.

Soon, encouraged by the men she met on Roblox and Discord, S.U. opened Instagram and Snapchat accounts, initially hiding them from her mother as well. S.U. was still under 13, and neither Instagram nor Snapchat verified her age or obtained parental consent. S.U. then fell victim to the social media platforms’ algorithms and became addicted, to the point where she would sneak access to the products in the middle of the night and became sleep-deprived. S.U.’s mental health quickly declined.

In July 2020, another Roblox user named Matthew represented himself as a game moderator and used Roblox’s direct message feature to exploit and groom S.U. as well. Using his position of apparent power within the Roblox game, Matthew manipulated S.U. into giving him Robux (in-game currency) and encouraged her to move their conversation to Snapchat. Matthew eventually convinced her to send sexually explicit images, which S.U. agreed to do only because she believed he could not save them, given Snapchat’s disappearing-content features. He did, however, save those images and allegedly sold them. S.U. relied heavily on Snapchat’s “My Eyes Only” feature to hide what was happening from her mother, who continued to monitor S.U.’s social media use but did not know about and could not access My Eyes Only.

By the end of the month, driven by the shame of her sexual exploitation and her addiction to social media, 11-year-old S.U. made her first suicide attempt. The next month, shortly before she began counseling, she tried again. Over the fall and winter, S.U. continued to decline. In March 2021, S.U. refused to continue participating in online school and became inconsolable. Her parents took her to the hospital, which held S.U. for five days after its staff learned she was planning another suicide attempt. S.U. was released with a safety plan, but attempted suicide again days later by mixing alcohol and sleep medication. She was placed in a partial hospitalization program and was self-harming by June. 

Reluctantly, S.U.’s parents placed her in a residential program but were forced to withdraw her after a fellow patient sexually assaulted her. Her mother, already more than $10,000 in medical debt from 2021 alone, was forced to quit her job to maintain constant vigilance over her daughter in an effort to prevent additional suicide attempts.

The lawsuit seeks to hold the social media companies financially responsible for the harms they have caused S.U. and her family. It also seeks an injunction requiring the platforms to make their products safer, which, the lawsuit claims, all of the defendants could do using existing technologies and at minimal time and expense to themselves.

About the Social Media Victims Law Center 

The Social Media Victims Law Center (SMVLC), socialmediavictims.org, was founded in 2021 to hold social media companies legally accountable for the harm they inflict on vulnerable users. SMVLC seeks to apply principles of product liability to force social media companies to elevate consumer safety to the forefront of their economic analysis and to design safer platforms that protect users from foreseeable harm.

About Matthew P. Bergman 

Matthew P. Bergman is an attorney, law professor, philanthropist, and community activist who has recovered over $1 billion on behalf of his clients. He is the founder of the Social Media Victims Law Center and the Bergman Draper Oslund Udo law firm; a professor at Lewis & Clark Law School; and a member of the boards of directors of nonprofit institutions in higher education, national security, civil rights, worker protection, and the arts.

If your child or young family member has suffered serious depression, a chronic eating disorder, hospitalization, sexual exploitation, self-harm, or suicide as a result of their social media use, speak to us today for a no-cost legal consultation.