Discord Lawsuit

The Social Media Victims Law Center has filed a lawsuit against Discord and Roblox for their role in facilitating the exploitation of a minor who believed Discord’s “Keep Me Safe” setting would protect her from predators. Unfortunately, our client is one of many young users who have been harmed on Discord’s platform.

Written and edited by our team of expert legal content writers and reviewed and approved by Attorney Matthew Bergman

According to Discord’s own transparency report, during the second quarter of 2022, 532,498 accounts were disabled for child safety violations, with 497,267 of those accounts involving sexualized content depicting minors. An additional 106,645 accounts were disabled for exploitative and unsolicited content.

In addition to sexually abusive and exploitative content, young users of Discord may also be exposed to content depicting self-harm, harassment, and graphic violence. Discord routinely removes offending accounts, but the fact that hundreds of thousands were disabled in a single three-month period shows the scale of the problem.

What is Discord?

Discord is a communication app originally created to allow members of the gaming community to network with like-minded gaming enthusiasts, even while playing. It has since expanded to include a wider audience of anyone who wishes to communicate with people with similar hobbies and interests.

Users who join Discord can create communities known as servers, which support both text and voice communication. Servers cannot be found through search unless they have more than 200 members and the administrator makes them public. This makes it difficult for parents to monitor their teens on Discord.

Is Discord safe for kids?

Discord is not safe for kids in its current format. Private servers enable predators to talk to children through text, voice, or video chat. Voice chats are not recorded, nor is any other record of these conversations preserved, creating challenges for law enforcement and parents investigating activities that occur on teens’ Discord accounts.

How is Discord dangerous to children and teens?

While Discord claims to actively police and moderate its platform, it does not go far enough to protect teens and tweens. The National Center on Sexual Exploitation includes Discord on its Dirty Dozen List due to its failure to “adequately address sexually exploitative content and activity on its platform.”

Lack of Age Verification

According to Discord’s terms of service, users must be 13 years of age or older, but the platform has no meaningful age verification protocol.

Insufficient Moderation

Discord relies largely on users to report harmful content rather than proactively moderating the platform itself. As a result, entire servers go unmonitored.

Inadequate Response to Parental Concerns

Amanda Schneider told CNN Business that an older man pursued her 13-year-old son on Discord for an inappropriate relationship. According to Schneider, Discord refused to help when she reported the contact. She later learned from police that the man was a registered sex offender.

Ineffective Safeguards

Discord allows minors to receive unsolicited communications and does nothing to prevent them from entering channels that host explicit or otherwise harmful content.

Lack of Parental Controls

Discord does not offer parental controls. Given the rampant adult content on the platform and its large number of young users, including many under 13, such controls are sorely needed.

Improper Default Privacy Settings for Minor Accounts

Discord offers privacy settings that block messages from people a user does not know, but minor accounts do not default to these settings. This makes it difficult for parents who are not tech-savvy to ensure their children’s accounts are configured safely.

Lawsuits Against Discord

Our Current Lawsuit Against Discord

In October 2022, the Social Media Victims Law Center filed a lawsuit against the social media platform Discord and the gaming giant Roblox on behalf of the family of an 11-year-old girl who attempted suicide multiple times after allegedly being financially and sexually exploited by adult men she met through the apps. The lawsuit charges that Discord and Roblox facilitated and enabled the victim’s communication with child sexual predators.

The lawsuit also names Snapchat and Meta, the parent company of Instagram, as defendants. It alleges that the design of those companies’ products led the Long Beach, California, girl to develop a severe social media addiction, causing her to sneak out of bed at night to access the apps and develop unhealthy sleeping habits. This allegedly contributed to the eventual decline of her mental health. The suit also maintains that Snapchat profited from the girl’s abuse.

Timeline of Events

According to the lawsuit, when the victim was 9 or 10 years old, her mother allowed her to use the popular online gaming platform Roblox, which is aimed at children under 13. The mother believed that appropriate safeguards were in place to protect young children from communicating with adults on the platform. However, adult users were able to communicate with underage players.

Many advocates are concerned that social media algorithms are addictive, especially when users are young and vulnerable. Discord, which requires users to be 13 to have an account, does not verify users’ ages or obtain parental consent before allowing minors to create an account. 

The victim selected Discord’s “Keep Me Safe” setting, believing it would monitor her activities and protect her from harm. The lawsuit alleges that it did not.

How did the platforms fail to protect the victim?

The companies that own Roblox, Discord, Snapchat, and Instagram failed to protect a vulnerable child from exploitation by predatory adults. 

Their products’ designs also made it impossible for the victim’s parents to intervene and stop the abuse because they either weren’t aware the accounts existed or weren’t allowed access to certain features. The complaint alleges several key ways the companies failed to keep the victim safe in this case.

What are the damages alleged in the Discord lawsuit?

Civil litigation can compensate victims harmed by a product. In this case, the victim suffered a severe mental health crisis that resulted in acts of self-harm and several suicide attempts. She missed school and required long-term hospitalization. Her parents are seeking compensation to cover the cost of her medical bills, which totaled tens of thousands of dollars.

In addition to medical debt, the victim and her family are seeking compensation for other financial and emotional damages.

What does the Discord lawsuit hope to accomplish?

The Social Media Victims Law Center works to hold social media companies legally accountable for the harm their products inflict on vulnerable users.

Social media companies already have the technology to remedy these problems with minimal cost and effort. But product liability lawsuits are the only way to force them to place consumer safety above economic gain and design safer platforms to protect users from harmful content and abuse. The Discord lawsuit requests an injunction requiring social media platforms to make their products safer. 

“These companies have failed to put in place basic, easily implemented safeguards that could have kept [the victim] and other children like her safe,” said Matthew P. Bergman, founding attorney of Social Media Victims Law Center. “I commend [the victim] and her mother for their bravery in sharing their story.”

“It truly demonstrates that these products can harm even the most loving, attentive families,” Bergman added. “I believe this suit will help bring them justice, and I hope that it leads these companies to improve their products.”

Can I file a social media lawsuit against Discord?

If your child has been harmed by using Discord, you may have grounds to file a lawsuit against the platform. Social media companies like Discord need to be held accountable. Otherwise, they will never be motivated to do the right thing and keep young people safe on their platforms.

Social media platforms claim they cannot be sued because Section 230 of the Communications Decency Act shields them from liability for what other people post. That shield is real, but these lawsuits target the conduct of the social media companies themselves, not the behavior of their users.

Strict Liability

The doctrine of strict liability applies to product liability, in which a product manufacturer can be held liable for harm consumers experience as a result of the following:

  • Inherently defective product design
  • Manufacturing defect
  • Failure to warn or adequately instruct

Discord has provided a dangerous product that is accessible to teens and children without parental controls. It also does not provide parents with adequate warnings or instructions to protect their children.

Negligence

Negligence occurs in personal injury law when harm ensues as a result of a defendant breaching a duty of care. Discord’s negligent conduct includes the following:

  • Failure to verify a user’s age
  • Failure to adequately moderate content
  • Failure to promptly and appropriately respond to parental reports of harmful and exploitative content

What about the arbitration clause in the terms and conditions?

Discord adopted a binding arbitration requirement in 2018. Arbitration is an alternative form of dispute resolution in which a dispute is decided by an arbitrator rather than a judge. The arbitrator’s decision is legally binding, as if it were a court judgment. Binding arbitration often favors the company that requires it.

However, minors generally cannot be bound by the contracts they enter and may disaffirm them, a principle affirmed by a U.S. District Court in California in R.A. v. Epic Games, Inc. You do not have to let an arbitration clause stand in the way of justice for your child. An experienced social media attorney can provide specific guidance about your right to sue.

What should I do if my child has been harmed because of Discord?

If your child has been harmed by Discord, report the harm to Discord’s Trust & Safety team and send Discord an email requesting your child’s account be disabled. If you believe a crime has been committed against your child, contact your local law enforcement agency.

You may need to spend significant time talking with your child to determine whether they have other social media accounts that could be causing harm. Your child may also need professional mental health counseling to address the harmful effects of social media use.

If you believe your child has been harmed by the effects of Discord or any other social media platform, contact the Social Media Victims Law Center today for a free and confidential consultation.

Get Expert Legal Help

Was your child or teen harmed by Discord? You may have legal options.