Meta Platforms, the parent company of Instagram and Facebook, has recently come under fire for putting minors at risk. For instance, internal documents surfaced showing that Instagram knew its algorithms made girls feel worse about their bodies, yet the platform continued to promote diet culture. Meanwhile, a March 2022 Wired report showed how easily predators can find child sex content and target children to perform explicit acts.
This year, Meta announced its intent to increase efforts to protect minors, releasing a series of new safety features. The Social Media Victims Law Center has reviewed these measures and weighed the plan’s benefits and shortfalls. Other outlets have done the same, and many agree that the new features are helpful but insufficient for protecting children. While they may provide some safeguards for children with involved parents, more needs to happen to make Meta’s platforms genuinely safe.
What Are Meta’s New Safety Features?
In June 2023, Meta announced a new set of teen safety features and parental controls.
On Messenger, parents and guardians can now do the following:
- View their teen’s screen time on Messenger
- Receive information about their teen’s contact list and privacy and safety settings
- See whether their teen allows messages and story views from friends, friends of friends, or no one
- Know when their teen reports someone if the teen shares that data
Meta has also made the following parental control additions on Instagram:
- After a teen blocks someone, they receive a suggestion to add a parent to their account.
- Parents can see how many friends their teen has in common with the accounts the teen follows and is followed by.
- Parents can choose what notifications they receive about their teen’s account.
These changes enhance the limited teen safety measures already in place. Teens will continue to receive safety notices when an adult in their direct message feed engages in suspicious behavior, such as messaging multiple people under 18. Similarly, people over 19 can message a teen only if that teen follows them.
By adding to these safety features, Meta has improved parental control over the teen experience. Unfortunately, significant harm has already occurred.
The Catalyst: Instagram Pedophile Networks
Meta announced its new safety features less than a month after a shocking Wall Street Journal report revealed that Instagram’s algorithms promote the sharing of child sex content. The platform’s recommendation systems “excel at linking those who share niche interests,” and unfortunately, that includes pedophiles and other predators.
The ease of finding child sex content is a dangerous issue Instagram has failed to mitigate. Wall Street Journal reporters found it relatively easy to locate content—both free and for sale—using explicit account descriptions and hashtags such as #preteensex.
Instagram’s personalization algorithm also actively promotes this type of content. When a user interacts with an account promoting child exploitation, the platform automatically recommends similar material. Some posts claim to provide content directly from the child or offer “meet-ups” with minors.
Instagram claims to be searching for new ways to fight child sex abuse, citing the removal of 27 pedophile networks over the past two years. Also, in response to the Journal’s investigation, the platform has blocked “thousands of hashtags that sexualize children” and restricted recommendations for explicit hashtags.
However, experts have found that Instagram still hosts a high volume of pedophile accounts and child sex content. The proliferation of this content increases the risk to children and expands predators’ access to child victims. The Journal has quoted experts saying that current automated efforts are insufficient and that Meta must invest in proactive, human-led investigations.
Will the New Features Be Enough?
Meta’s new safety features supposedly add a layer of protection, but many experts doubt their effectiveness. The most important criticism targets Meta’s exclusive focus on parental controls.
Leaving Children in Care Behind
As one child safety officer told The Guardian, many of the most vulnerable young people don’t have an engaged parent. That category includes children in government care, who may be more susceptible to predators. In the U.S. alone, 60 percent of child sex trafficking victims have spent time in the foster care system.
Children in foster care are more likely to lack a parental figure who is available and willing to use supervision tools. If all of Meta’s child safety efforts rest on parental involvement, those children will fall through the cracks.
Relying on the Honor System
Meta makes it easy for young people to set up accounts without their parents’ knowledge. Its primary age verification strategy is to ask users for their birth date at signup.
If a young person can do basic arithmetic, they can falsify their age when signing up. Meta has begun to use artificial intelligence to screen for underage accounts, primarily by searching for posts that mention or allude to a user’s young age. However, as Meta explicitly admits, “The technology isn’t perfect.”
If a child does evade age controls and creates a Facebook account without their parents’ knowledge, Meta’s supervision strategy is useless. Any young person who doesn’t feel safe giving their parents access to their Facebook account or who doesn’t have an involved parent willing to provide supervision is at risk of significant harm online.
Demanding Meta Take Responsibility
By relying on parental controls for teen safety, Meta puts responsibility on the consumer for what is essentially a product issue. Parental controls will not stop predators from using their accounts to find child sex content. As experts have told the Wall Street Journal, there are likely hundreds of thousands of these accounts on Instagram alone.
Meta needs to step up its efforts to protect underage users. It needs safeguards and limits on content discovery features so the platform can’t connect predators with one another or recommend exploitative content to them. Currently, this is one of the most significant safety risks on Instagram.
Meta must also implement better controls to prevent children from receiving explicit communications from adults. The Social Media Victims Law Center has filed several lawsuits on behalf of families whose children communicated with predators on Instagram.
Meta has also allowed children to become addicted to its platforms. Social media addiction increases children’s exposure to harmful content, from predatory material to cyberbullying, and can lead to severe, long-term mental health issues.
If you believe Meta Platforms may have harmed your child, contact us today for a free consultation.