Discord Lawsuit

The Social Media Victims Law Center has filed a lawsuit against Roblox and Discord for their role in facilitating the exploitation of a minor who believed Discord’s “keep me safe” setting would protect her from predators. Unfortunately, our client is one of many young users who have been harmed on Discord’s platform.

According to Discord’s own transparency report, during the second quarter of 2022, 532,498 accounts were disabled for child safety violations, with 497,267 of those accounts involving sexualized content depicting minors. An additional 106,645 accounts were disabled for exploitative and unsolicited content.

In addition to sexually abusive and exploitative content, young users of Discord may also be exposed to content depicting self-harm, harassment, and graphic violence. Discord routinely removes the accounts responsible, but the fact that thousands of such accounts are removed in a single three-month period shows how pervasive this content is.

What is Discord?

Discord is a communication app originally created to allow members of the gaming community to network with like-minded gaming enthusiasts, even while playing. It has since expanded to include a wider audience of anyone who wishes to communicate with people with similar hobbies and interests.

Users who join Discord can create communities known as servers. Servers may be established for text communication or voice communication. It is not possible to search for servers unless they have more than 200 users and the administrator makes them public. This makes it difficult for parents to monitor their teens on Discord.

Is Discord safe for kids?

Discord is not safe for kids in its current format. Private servers enable predators to talk to children through text, voice, or video chat. Voice chats are not recorded, nor is any other record of these conversations preserved, creating challenges for law enforcement and parents investigating activities that occur on teens’ Discord accounts.

How is Discord dangerous to children and teens?

While Discord claims to actively police and moderate its platform, it does not go far enough to protect teens and tweens. The National Center on Sexual Exploitation includes Discord on its Dirty Dozen List due to its failure to “adequately address sexually exploitative content and activity on its platform.”

Lack of Age Verification

According to the terms of service, users must be 13 years of age or older, but there is no meaningful age verification protocol.

Insufficient Moderation

Discord relies largely on users to report harmful content rather than proactively moderating the platform itself. As a result, entire servers go unmonitored.

Inadequate Response to Parental Concerns

Amanda Schneider told CNN Business that her 13-year-old son was pursued for an inappropriate relationship by an older man on Discord. According to Amanda, Discord refused to help her when she reported the contact. She later learned from police that the man was a registered sex offender.

Ineffective Safeguards

Discord allows minors to receive unsolicited communications and does nothing to prevent minors from entering channels that host explicit or otherwise harmful content.

Lack of Parental Controls

Discord does not offer parental controls. Such controls are sorely needed given the rampant adult content on the platform and the large number of young users, including many younger than 13.

Improper Default Privacy Settings for Minor Accounts

Discord offers privacy settings that restrict users from receiving messages from people they do not know, but minor accounts do not default to these settings. As a result, parents who are not tech-savvy may not realize they need to change the settings to keep their children’s accounts safe.

Our Current Lawsuit Against Discord

The lawsuit filed against Discord by the Social Media Victims Law Center is on behalf of a minor who repeatedly attempted suicide at the age of 11 after being exploited by multiple men on Discord. An older male initially approached her on Roblox, a gaming platform popular with children under 13.

The man persuaded the child to move the conversation to Discord, where she was exploited by multiple men. Although Discord requires users to be 13 or older, our client was misled by Discord’s “keep me safe” setting into believing the platform would protect her from harm.

However, the platform failed to protect her from exploitation and harmful content. It also did not attempt to verify her age.

Can I file a social media lawsuit against Discord?

If your child has been harmed by using Discord, you may have grounds to file a lawsuit against the platform. Social media companies like Discord need to be held accountable. Otherwise, they will never be motivated to do the right thing and keep young people safe on their platforms.

Social media platforms claim they cannot be sued because Section 230 protects them from liability for what other people post. Section 230 does shield platforms from liability for user-generated content, but the issue in these cases is not the behavior of users; it is the conduct of the social media companies themselves.

Strict Liability

The doctrine of strict liability applies in product liability cases, in which a product manufacturer can be held liable for harm consumers suffer as a result of any of the following:

  • Inherently defective product design
  • Manufacturing defect
  • Failure to warn or adequately instruct

Discord has provided a dangerous product that is accessible to teens and children without parental controls. It also does not provide parents with adequate warnings or instructions to protect their children.

Negligence

Negligence occurs in personal injury law when harm results from a defendant’s breach of a duty of care. Discord’s negligent conduct includes the following:

  • Failure to verify a user’s age
  • Failure to adequately moderate content posts
  • Failure to promptly and appropriately respond to parental reports of harmful and exploitative content

What about the arbitration clause in the terms and conditions?

Discord adopted a binding arbitration requirement in 2018. Arbitration is an alternative form of dispute resolution in which a dispute is decided by an arbitrator rather than a judge. The arbitrator’s decision is legally binding, as if it were a court decision. Binding arbitration generally favors the company that requires it.

However, minors under the age of 18 generally cannot be bound by contracts, a principle affirmed by a U.S. District Court in California in R.A. v. Epic Games, Inc. You do not have to allow an arbitration clause to stand in the way of justice for your child. An experienced social media attorney can provide specific guidance about your right to sue.

What should I do if my child has been harmed because of Discord?

If your child has been harmed by Discord, report the harm to Discord’s Trust & Safety team and send Discord an email requesting your child’s account be disabled. If you believe a crime has been committed against your child, contact your local law enforcement agency.

You may need to spend significant time talking with your child to determine whether your child has other social media accounts that could be causing harm. Your child may also need outside mental health counseling to address the harmful effects of social media.

If you believe your child has been harmed by the effects of Discord or any other social media platform, contact the Social Media Victims Law Center today for a free and confidential consultation.

Content Reviewed by: Matthew P. Bergman

Client Testimonials

Sue H.
People who are taking care of loved ones who are traumatized can’t stop thinking about what they are going through. One reason it was so invaluable that we had Matt in our lives is that I was so completely distraught over what was happening; it was wonderful to know that someone in the background was making certain that things were being taken care of. I will always be grateful for that.
Richard M.
I would like to thank you and your staff for your continued efforts and support in assisting our family... Your firm’s relentless pursuit of justice is greatly appreciated.