What is the Protecting Kids on Social Media Act?
The Protecting Kids on Social Media Act is a bipartisan bill introduced on April 26, 2023, by U.S. Senators Brian Schatz, Tom Cotton, Chris Murphy, and Katie Britt. The bill would set 13 as the minimum age to use social media platforms, require users under 18 to obtain parental consent before joining a platform, and ban the use of recommendation algorithms on users younger than 18.
Written and edited by our team of expert legal content writers and reviewed and approved by Attorney Matthew Bergman
- Content last updated on: December 5, 2023
Social media platforms have come under fire in recent years for their harmful effects on users, particularly those under 18. Numerous studies have been released connecting social media use to depression, anxiety, eating disorders, sleeping problems, and suicide in teens.
The Protecting Kids on Social Media Act is the latest congressional attempt to find a solution. It aims to set an age requirement for social media use, establish parental consent requirements, and ban the use of algorithms on users younger than 18 years old. Contact the Social Media Victims Law Center for more information about the dangers of social media.
What Are the Key Provisions of the Protecting Kids on Social Media Act?
If enacted, the act would protect children and adolescents by requiring social media companies to implement safety measures, such as age requirements and parental consent, to control who uses their platforms. It would also establish a pilot program to verify users' identities and set penalties for a social media company's noncompliance.
The key provisions of the Protecting Kids on Social Media Act include:
- Minimum Age Requirement for Social Media Use
- Parental Consent Requirements
- Mandatory Safety Features on Social Media Platforms
- Penalties for Non-compliance with the Act
Minimum Age Requirement for Social Media Use
The Protecting Kids on Social Media Act establishes 13 as the minimum age to use social media. This aligns with many social media platforms’ existing policies requiring users to be at least 13 years old. Companies like Twitter, Facebook, Instagram, and TikTok already have this requirement or have a view-only experience for users under 13.
Parental Consent Requirements
Another key provision is parental consent. The bill requires a parent or guardian to give permission before a user under 18 can create a social media account. Thus, users aged 13 to 17 would need parental consent to open an account.
Mandatory Safety Features on Social Media Platforms
Another major feature of the legislation bans social media platforms from using algorithms on users under 18. Social media companies use algorithms to track users’ activities and target ads and content to their preferences. Multiple studies indicate this has a particularly devastating impact on youth and teens by fostering social media addiction and exposing them to harmful content.
The act also establishes a pilot program to verify that users meet the age requirement and verify parental consent. The verification will be done by uploading copies of the following information:
- Government-issued documents
- Documents issued by an educational institution
- State DMV records
- IRS records
- SSA records
- Other government or professional records
This program is voluntary, and the information used to verify age and parental consent cannot be used for any other reason or forwarded to any law enforcement agency.
Penalties for Non-compliance with the Act
If a social media company is found to have violated this act, the civil penalty is calculated by multiplying an amount not to exceed $10,000 by the greater of (1) the number of days the social media company was not in compliance with the act or (2) the number of users who were harmed by the violation.
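The statutory formula above can be expressed as a short Python sketch. The function name and the dollar figures in the example are illustrative assumptions, not taken from the bill itself:

```python
def civil_penalty(per_unit_amount: float, days_noncompliant: int, users_harmed: int) -> float:
    """Penalty = per-unit amount (capped at $10,000) times the greater of
    (1) days of noncompliance or (2) number of users harmed."""
    if per_unit_amount > 10_000:
        raise ValueError("per-unit amount may not exceed $10,000")
    return per_unit_amount * max(days_noncompliant, users_harmed)

# Hypothetical example: $5,000 per unit, 30 days out of compliance,
# 1,200 users harmed. max(30, 1200) = 1200, so the penalty is $6,000,000.
print(civil_penalty(5_000, 30, 1_200))
```

Because the multiplier is the greater of the two counts, a violation that harms many users can produce a far larger penalty than the length of the noncompliance alone would suggest.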
Importance of the Bill
The world differs from decades ago when the Internet was still a novelty. Children, adolescents, and teenagers no longer interact and socialize at the mall or the arcade. They interact online through social media platforms.
The following information collected by the Pew Research Center shows the use of social media among American teenagers:
- YouTube: 95 percent
- TikTok: 67 percent
- Instagram: 62 percent
- Snapchat: 59 percent
- Facebook: 32 percent
- Twitter: 23 percent
Because children and teenagers are consistently on social media, laws are needed to protect them from harm.
Risks and Dangers of Social Media for Minors
Social media is dangerous to minors because a child’s mind is more easily influenced than an adult’s. The brain is still developing as a child or teenager, particularly during puberty. During this time, young minds start to operate based on social rewards. American Psychological Association Chief Science Officer Mitch Prinstein has said children cannot restrain themselves from using social media since they crave social attention.
A like on a post, an upvote on a video, or a scroll that refreshes the feed triggers the brain's social reward system, releasing dopamine, the "happy hormone," and keeping the user on the platform.
Studies have shown the following dangers of social media on teen mental health:
- 32 percent of teenage girls felt that Instagram made their insecurity about their bodies worse
- Social media contributes to sleep problems in adolescents aged 13 to 15 years old
- A correlation between eating disorders and social media use in adolescents
- A connection between social media use and an increase in self-harm and suicide among adolescents
Impact of Cyberbullying, Harassment, and Exploitation on Children
Cyberbullying, harassment, and exploitation of children plague social media as well. The American Academy of Child and Adolescent Psychiatry lists several risks and dangers of social media, including exposure to unsafe content such as sex, drugs, violence, cyberbullying, and exposure to dangerous persons.
The Pew Research Center found that almost 60 percent of U.S. teens have been bullied or harassed online. And the Federal Bureau of Investigation has estimated that there are half a million predators online every day with multiple online profiles. Over half of the victims are aged 12 to 15, and almost 90 percent of them are contacted through chat rooms and instant messaging.
Accountability on the Part of Social Media Companies
Lawmakers, psychologists, and much of the general public increasingly agree that the growing teen mental health crisis is linked to the increased use of social media. Thus, social media must become a safer environment for adolescents and teenagers.
This push for change and the need for greater accountability on the part of social media companies erupted after the bombshell statements made by Facebook whistleblower Frances Haugen. She revealed that Facebook put profits over Internet safety. She said the social media giant knew its platform had harmful content but failed to address it. Facebook allegedly recognized that harmful content led to more users, which meant more advertisements.
Furthermore, leaked internal studies found that 13.5 percent of teen girls stated that Instagram made their suicidal thoughts worse, and 17 percent said that Instagram made their eating disorders worse.
Now, federal and state governments are looking to make social media safe for young users. Besides the Protecting Kids on Social Media Act, several other pending bills attempt to make the Internet safer for children and teenagers, such as the Social Media Child Protection Act and the Children’s Online Privacy Protection Act 2.0 (COPPA 2.0). Some states like Utah, Montana, and Arkansas have also begun introducing legislation targeting social media companies.
Even parents are looking to hold social media companies accountable. There has been a massive increase in civil lawsuits against social media companies. In March 2023, many of these civil cases were consolidated into multidistrict litigation. Defendant social media companies include Meta, the owner of Facebook and Instagram; Google and Alphabet, who own YouTube; Snap, which owns Snapchat; and ByteDance, which owns TikTok.
What Is the Status of the Act in Congress?
This act was introduced at the end of April 2023. It needs to gain the support of the Senate Committee on Commerce, Science, and Transportation before moving to the Senate floor for a vote. If passed in the Senate, the House of Representatives would also have to pass it, and the president would have to sign it.
Public Opinion on the Act
This act aims to protect children and teenagers from the dangers of social media. Senator Cotton said the age and parental consent requirements would “put parents back in control of what their kids experience online.”
However, the bill has received pushback, particularly over the parental consent requirement. Critics worry that teens in unsupportive households, including LGBTQ+ youth whose parents do not accept them, could be cut off from social media entirely. Alternatively, children could pressure their parents into giving consent. Critics also argue that the bill could sever teenagers from vital online communities while leaving harmful design features, such as endless scrolling, in place.
Founder and CEO of Common Sense Media James P. Steyer stressed that Congress should place the burden on social media companies to make the Internet a safe platform for children and teenagers.
Contact Us for More Information
For more information about the negative effects of social media on your child and how we can help, contact the Social Media Victims Law Center.
Frequently Asked Questions
We only handle cases on a contingent fee basis. This means that we are paid a portion of any recovery obtained in the case and you do not owe us any attorneys’ fees if the lawsuit does not result in a recovery.
Every case is unique. Our attorneys will work with your family to evaluate your potential case and help you evaluate whether filing a lawsuit or other legal proceeding is in your family’s best interest. Generally speaking, the types of cases we handle involve serious mental health effects, including attempted or completed suicide, eating disorders, inpatient mental health treatment, or sexual trafficking/exploitation that was caused by or contributed to through addictive or problematic social media use by teens and young adults.
We are a law firm based near Seattle, WA, made up of lawyers who have spent their entire careers representing victims harmed by dangerous products. We are also parents. Shocked and troubled by recent revelations about the harm that social media platforms, which powerful technology companies have designed to be highly addictive, cause teens and young adults, we launched the Social Media Victims Law Center specifically to help families and children who have suffered serious mental harm or exploitation through social media use obtain justice.
Contact Us Today