Social media platforms hide behind Section 230 of the Communications Decency Act to avoid liability when they fail to take reasonable measures to protect their users from harm. This outdated law was passed in 1996, before the rise of social media, and must be changed to hold social media companies accountable when their platforms cause harm.
Written and edited by our team of expert legal content writers and reviewed and approved by Attorney Matthew Bergman.
Section 230 allows internet companies to host user content without being held liable for it. This benefits small businesses, such as private blogs and newspapers, as well as large social media companies that offer user-generated content.
The overarching purpose of Section 230 is to protect free speech on the internet. The general public understands that laws against falsely shouting “Fire!” in a crowded theater do not violate free speech but protect the public from harm. Section 230 was useful when it was created, but internet activity has changed, and the law must keep pace to protect the public.
What is Section 230?
The protections provided by 47 U.S. Code § 230 are listed below:
Treatment of Publisher or Speaker
According to (c)(1) of § 230: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
According to (c)(2) of § 230, internet companies are protected from liability if they moderate and remove content. Before Section 230, companies were held accountable for content posted by users on their sites if they moderated any of the content.
Additionally, (e)(3) of § 230 prohibits states and municipalities from passing liability laws against internet companies covered under Section 230.
Why was Section 230 passed?
Section 230 was passed in 1996 while the internet was still in its infancy. Social media platforms did not yet exist, and fewer than 25 percent of the public used the internet at that time, according to a Pew Research poll.
As of 2022, more than 63 percent of the global population used the internet, and 59 percent used social media, according to Statista.
Before the passage of Section 230, internet companies were liable for user-generated content if they made any effort to moderate it, according to an NPR article. Essentially, companies were penalized for attempting to keep the content safe and clean.
The effects of lax moderation can be seen on networks like Reddit and Discord, where users regularly post pornographic, offensive, and hateful content with little oversight.
Section 230 was created to facilitate internet growth and allow sites that publish user-generated content, referred to as “interactive computer services,” to moderate content without the fear of liability.
According to former Congressman Chris Cox, a co-author of Section 230, “The original purpose of this law was to help clean up the Internet, not to facilitate people doing bad things on the Internet.”
Congressman Cox has expressed support for updating Section 230, stating the assumption was that in exchange for liability protection, internet companies would responsibly police their platforms.
The Harm Caused by Section 230
Despite widespread internet changes, Section 230 has remained largely unchanged.
Social media companies such as Facebook and Snapchat created environments that allow users to post harmful content. They monetize that content and target users with it, even though they could promote less harmful content, because harmful content generates more views and greater profits.
Whistleblower Frances Haugen publicized internal Facebook documents showing that Facebook knew it was causing harm but chose to continue in order to protect its profits.
These platforms have taken advantage of Section 230’s intent, profiting from harmful content under a perception of total immunity. Misconstruing the law in this way has put a generation of users worldwide at risk by promoting content that enables the following:
- Terrorism and genocide
- White supremacy, homophobia, and other forms of bigotry
- The sale of deadly drugs to youth and children
- Low self-esteem
- Body-shaming and disordered eating
- Self-harm and suicide
- Grooming and sexual abuse
Section 230 was meant to provide a safe harbor for social media companies to police their platforms and reduce harmful content. However, positive content is less profitable than harmful content. Social media platforms have the means to protect users from harmful content, but doing so would adversely impact their massive profit margins. The results have been deadly and heartbreaking.
Social Media Addiction
Social media platforms promote addictive content to increase their profit margins, even while company research documents the harmful nature of the content regularly displayed to users.
Section 230 shields social media companies from liability for what users post. Social media companies created addictive algorithms for which they should be held liable.
Illegal Drug Sales
Illegal drugs are regularly promoted and sold on social media platforms like Instagram and Snapchat. These drugs are often sold by cartels that lace them with deadly fentanyl without the buyer’s knowledge. Shockingly, social media algorithms help drug seekers and drug sellers find each other.
Suicide and Self-Harm
Suicidal ideation and non-suicidal self-harm are promoted on social media in much the same way as drugs. TikTok currently faces lawsuits over the deaths of young children who attempted a stunt known as the blackout challenge, in which participants are encouraged to choke themselves until they lose consciousness.
Body Image and Eating Disorders
Social media sites like Instagram encourage teens to compare their bodies to those of celebrities and peers who have filtered their images to look thinner than they are. Content promoting eating disorders is rampant on social media, especially Instagram.
The result has been severe body dissatisfaction, obsession with body image, and eating disorders with lifelong physical and emotional effects.
Mental Health Disorders
Social media can cause mental health disorders in young people and exacerbate existing issues due to the following:
- Hate speech
- A perception that others’ lifestyles are better than their own
- Fear of missing out
Section 230 Reform
Section 230 would likely have been written very differently if it were drafted today. Yet it has seen little meaningful reform since its passage more than 25 years ago. Both former President Trump and President Biden have expressed support for repealing it.
The only meaningful Section 230 reform to date is the Fight Online Sex Trafficking Act, known as FOSTA, which was signed into law in 2018. It is often referred to together with SESTA, the Senate’s companion bill, which stands for the Stop Enabling Sex Traffickers Act.
FOSTA amends Section 230 to exempt sexual exploitation of children, sex trafficking, and prostitution from liability protection. This was in response to websites such as Backpage making no effort to stop the sex trafficking of children and victims of forcible sex acts.
This law was passed with significant bipartisan support. While it is not perfect, it removes some liability protection from social media companies, as a Texas Supreme Court ruling has affirmed.
Gonzalez v. Google LLC
On May 18, 2023, the Supreme Court ruled in favor of tech companies in two cases, including Gonzalez v. Google, which dealt with content moderation on social media platforms. Social Media Victims Law Center attorney Glenn Draper weighed in on the court’s decision:
“The decision from the Supreme Court, while certainly less than we had hoped it could be, was by and large positive for our clients. The decision suggests that the Supreme Court wants to address Section 230 but felt that the underlying facts and allegations in the Gonzalez case were not the right vehicle to do this. By expressly vacating the Ninth Circuit’s decision in Gonzalez, the Supreme Court rendered that decision a nullity, thus removing an adverse decision that would have been binding on Judge Gonzalez-Rogers in the federal MDL. In effect, the Supreme Court decision gives Judge Gonzalez-Rogers more leeway to decide the issue. The Supreme Court had other options available that could have left the Ninth Circuit ruling in place. For example, it could have dismissed the case saying certiorari was improvidently granted (“DIG”), which would have left the underlying decision intact. The fact that it didn’t suggests that the Supreme Court has more to say about Section 230.
I have seen some reporting claiming this is a “win” for the social media companies. We don’t see it like that.
While the Court did not take this opportunity to rein in Section 230, there isn’t anything in the case that suggests this is because they think Section 230 should be given the broad reach the social media companies want.”
Should Section 230 be repealed?
A full repeal of Section 230 would be an extreme approach that could cause harm to free speech on the internet. However, it does need to be revised.
Has your teen been harmed by social media?
The Social Media Victims Law Center is taking a stand against social media companies that would rather make a massive profit than do what is right for the young people they target.
If your teen has been harmed by social media, the company responsible should have to answer for that harm. Contact us today for a free consultation.
Frequently Asked Questions
We handle cases only on a contingent-fee basis. This means we are paid a portion of any recovery obtained in the case, and you do not owe us any attorneys’ fees if the lawsuit does not result in a recovery.
Every case is unique. Our attorneys will work with your family to evaluate your potential case and help you evaluate whether filing a lawsuit or other legal proceeding is in your family’s best interest. Generally speaking, the types of cases we handle involve serious mental health effects, including attempted or completed suicide, eating disorders, inpatient mental health treatment, or sexual trafficking/exploitation that was caused by or contributed to through addictive or problematic social media use by teens and young adults.
We are a law firm based near Seattle, WA, made up of lawyers who have spent their entire careers representing victims harmed by dangerous products. We are also parents. Shocked and troubled by recent revelations about the harm caused to teens and young adults by social media platforms, which powerful technology companies have designed to be highly addictive, we launched the Social Media Victims Law Center specifically to help families and children who have suffered serious mental harm or exploitation through social media use obtain justice.
Matthew P. Bergman