
Online Safety Bill

The United Kingdom is considering legislation that, if passed, would fundamentally change how the internet and social media can be used by children and young adults. It would also open the door for civil litigation if children are able to access harmful or illegal material.

Written and edited by our team of expert legal content writers and reviewed and approved by Attorney Matthew Bergman


Because the internet is a largely unregulated space, it’s easy for children and young adults to access information and content that can be detrimental to their mental health and well-being.

In light of recent events, including a surge in teenage depression, self-harm, and suicide, the United Kingdom is taking action with the introduction of the Online Safety Bill to make the internet a safer place for its youth.

What is the Online Safety Bill?

The Online Safety Bill, initially introduced in Parliament in May 2021, strives to fundamentally change the internet for young Brits.

The bill would empower Ofcom, the United Kingdom’s communications regulator, to police social media platforms and search engines.

It would also impose a duty of care on tech companies to regulate illegal and potentially harmful or damaging content to reduce visibility and access, especially for young users.

The statutory regulation would apply to any “user-to-user services” and “search services,” such as:

  • TechUK members (including over 25,000 tech companies)
  • Social media companies like Facebook and Instagram (owned by Meta), Twitter, Pinterest, Snapchat, Discord, and TikTok
  • Gaming platforms
  • Search engines like Google and Bing

Under the proposed law, these companies would have to take proactive steps to enhance internet safety for kids by:

  • Putting safeguards in place to monitor and remove illegal content and legal but harmful content
  • More closely monitoring and restricting access for children under the age of 13

Illegal content would include images and videos of child pornography, child abuse, and sexual abuse, while “legal but harmful” content might include material involving self-harm, suicide, or eating disorders.

What’s categorized as “legal but harmful” will likely be the source of great discussion, as the phrasing is rather broad and subjective. Critics worry that regulating legal content would open the floodgates for censorship, which has become increasingly problematic in recent years.


Why do we need the Online Safety Bill?

At its core, the Online Safety Bill aims to keep damaging and harmful content away from impressionable children and young adults.

Social media use among children increased by 17 percent between 2019 and 2021. A report by the New York Times revealed that children between the ages of 8 and 12 spend about five and a half hours online every day, while teenagers log about eight and a half hours online daily.

While there are some safeguards in place, children and teens essentially have the world at their fingertips when they’re online. Young kids are smart and tech-savvy, so it’s not difficult for them to find their way to content that their parents or reasonable adults wouldn’t approve of.

At the end of the day, it’s far too easy for kids to access content that could exacerbate a mental health issue, aggravate an eating disorder, or introduce them to information that they’re simply not mature enough to handle.

Social media companies are complicit in this problem, too. They know what users are accessing, and their algorithms push even more of that content their way, encouraging social media addiction. So a child struggling with depression who looks at images of self-harm or reads about suicide may be met with a deluge of similar posts that deepen their illness.


How has Molly Russell influenced the Online Safety Bill?

In 2017, Molly Russell, a seemingly healthy and happy young girl, took her own life. An inquest into her tragic death revealed that she had spent hours online bingeing “life-sucking” content that the coroner in her case believes played a considerable role in her death.

According to coroner Andrew Walker, “The content [she viewed] was particularly graphic, tending to portray self-harm and suicide as an inevitable consequence of a condition that could not be recovered from.”

He also referenced the social media algorithms that likely pushed even more harmful content her way.

After reviewing Molly’s case, Mr. Walker had two suggestions for Parliament as it considers the bill:

  • Create separate social media spaces for children and adults.
  • Create stronger age verification processes to ensure that children can’t access harmful information.

The Molly Russell inquest has sparked a lot of conversation about the Online Safety Bill. As the bill makes a resurgence, proponents and critics alike are eager to have their opinions on the matter heard.

Why is there pushback against the Online Safety Bill?

The Online Safety Bill would make fundamental changes to the way the internet is used in the United Kingdom.

There are three primary groups of critics:

  • Social media companies and technology giants that do not want to be regulated or subject to civil litigation
  • Individuals and corporations concerned about data privacy
  • Individuals and groups that are concerned about government overreach, UK freedom of speech, and censorship

The pushback from technology companies like Meta is understandable. The law would create a legal duty of care that could open the door to thousands of lawsuits from families of children who experience adverse mental health effects after accessing their online platforms.

Those concerned about censorship are particularly worried about how “legal but harmful” content would be defined. Some claim that technology companies would “err on the side of censorship” to avoid legal problems and, in turn, create a “disaster for free speech.”

When will the Online Safety Bill take effect?

The Online Safety Bill was reintroduced in Parliament in March 2022. According to then-Prime Minister Liz Truss, the bill is expected to move through the legislative body after the House of Commons summer recess comes to a close. Once it passes the House of Commons, it will be heard in the House of Lords.

The bill may change as it moves through Parliament but is expected to pass and become law in early 2023.

Even after the Online Safety Bill becomes law, there will be several issues requiring clarification, including what can be designated as legal but harmful content and, therefore, regulated.

Once the Online Safety Bill becomes law, families in the United Kingdom may have a clearer path to legal action against social media giants when their children fall victim to hazardous, harmful content online.
