Instagram Promotes Accounts Sharing Child Sex Abuse Material

According to recent investigations, Instagram’s recommendation algorithms have promoted accounts sharing child sexual abuse content. Researchers determined that Instagram has a particularly severe problem with accounts offering self-generated child sexual abuse material because of its hashtag system, its recommendation algorithm, and the relatively long life of seller accounts. The Social Media Victims Law Center offers information about how to avoid pedophiles on social media platforms such as Instagram.

According to investigations by The Wall Street Journal and researchers at the University of Massachusetts Amherst and Stanford University, Meta’s Instagram social media platform hosts a vast network of accounts sharing and selling underage-sex content. Unlike the file-transfer services and forums that pedophiles have traditionally used to share illicit content, Instagram does not just host these pictures; its algorithms amplify the content by connecting pedophiles and guiding them to sellers of illicit material.

Here is what families need to know about the investigations, their key findings, Instagram’s response and responsibilities, and the impact of Instagram’s algorithms on users and society.

How Researchers Conducted the Investigations

To research pedophiles on Instagram, The Wall Street Journal consulted with academic experts on online child safety at the following institutions:

  • Stanford’s Internet Observatory, a division of the university’s Cyber Policy Center focusing on social media abuse, published an independent analysis of Instagram features that helped users connect and find content.

  • University of Massachusetts’ Rescue Lab, which focuses on driving research and creating tools to rescue children from internet-based victimization, analyzed how pedophiles on Instagram fit into the larger world of virtual child exploitation.

Both teams quickly spotted large-scale communities promoting child sexual abuse. When the researchers created test accounts and viewed a single account in the pedophile network, they instantly received “suggested for you” recommendations for buyers and sellers of child sex abuse content and for accounts linking to off-platform content-trading sites. A test account that followed just a handful of these recommendations was flooded with underage sexual content.

The Investigations' Key Findings

The investigations revealed many disturbing details about pedophiles on Instagram. Here are the key findings:

  • Pedophiles on Instagram are open about their interests. They use Instagram’s hashtags, such as #preteensex and #pedowhore, to advertise and find child-sex material.

  • Many of the child sex abuse accounts are purportedly run by children. They also use overtly sexual handles and phrases such as “little slut for you.”

  • Instagram accounts selling child sex abuse material usually do not publish the content openly. Instead, they post menus and invite buyers to commission specific sexual acts, including children performing sexual acts with animals. Some menus also allow clients to order children for face-to-face meetings.

  • Besides underage sex content creators and buyers, other accounts in the Instagram pedophile community discuss their access to children and collect pro-pedophilia memes. Former and current Meta employees who have worked on Instagram child-safety initiatives believe there are hundreds of thousands to millions of Instagram accounts created to view, buy, and sell child sex abuse content.

  • Instagram’s parent company, Meta, has struggled to suppress the discovery of pedophilic material because it does not adequately police the hashtags, keyword searches, and other design features that surface both legal and illicit content. By comparison, the Stanford research team found 128 accounts offering to sell child sex abuse material on Twitter, less than a third of the number it found on Instagram. Twitter also did not recommend pedophilic accounts to the same degree as Instagram and took down the accounts much more quickly.

  • Instagram permitted users to search for terms that its own systems associate with child sexual abuse material. If a user tapped on such a keyword, a pop-up warned that the results might contain images of child sexual abuse and that creating and consuming such content causes “extreme harm” to children. Despite the warning, users could choose to get resources about child sex abuse or “see results anyway.” Instagram removed the option to view search results for keywords likely to produce illicit content only after The Wall Street Journal contacted it with questions.

  • Accounts selling underage-sex content do not always stay gone. Under Instagram’s internal guidelines, penalties for violating community standards apply to accounts, not to the devices or people behind them. Because Instagram lets users run multiple linked accounts, pedophiles can easily evade enforcement. Sellers often list their backup account handles in their bios, allowing them to keep sharing content with the same audience if Instagram removes their original account.

Instagram's Response and Responsibilities

In response to the investigations, Meta acknowledged problems within its enforcement operations. It also stated that it had done the following to remove underage sexual content from Instagram:

  • Created an internal task force
  • Taken down 27 pedophile networks over the past two years
  • Worked on preventing Instagram’s systems from recommending pedophilic content and connecting pedophiles

According to Alex Stamos, the head of the Stanford Internet Observatory and Meta’s Chief Security Officer until 2018, getting even obvious child abuse under control would take a sustained effort. Meta’s automated screening currently targets known child exploitation material and cannot detect new images and videos or efforts to sell them. To prevent and detect such activity, Meta must track and disrupt pedophile networks and make it hard for them to connect with each other, recruit victims, and find content.

Meta is responsible for disrupting pedophile networks because law enforcement agencies do not have the resources to investigate more than a small fraction of the tips the National Center for Missing & Exploited Children receives.

The Impact of Instagram's Child Sex Abuse-Promoting Algorithms on Users and Society

According to the Wall Street Journal investigations, Instagram’s internal statistics reveal that users see child exploitation in less than one in 10,000 posts viewed.

Although that rate seems low, it is not low enough. According to Statista, eight percent of Instagram users are aged 13 to 17. Additionally, according to The Common Sense Census: Media Use by Tweens and Teens, 2021, children aged eight to 12 spend an average of five and a half hours a day in front of screens, and teens aged 13 to 18 spend an average of eight and a half hours. Because children and teens spend so much time on social media, their chances of encountering child sex abuse content and pedophiles are not as low as that figure suggests.

Allowing child sex abuse materials and pedophiles on Instagram can severely affect teens, children, and society. Some of these effects include:

  • Trauma and psychological harm: Exposure to abusive and explicit content can cause lasting psychological distress and trauma for children and teen viewers.

  • Normalizing abusive behavior: Easy access to child sex abuse materials may normalize child sex abuse, increasing the risk of tolerating or perpetuating these harmful actions.

  • Exploitation and victimization: Instagram’s algorithms make it easier for pedophiles to find, buy, and sell child sex abuse materials, fueling the demand for them. This perpetuates the victimization and exploitation of vulnerable minors.

  • Grooming and predatory behavior: Pedophiles may attempt to groom minors and initiate abusive relationships to create child sex abuse content.

  • Child safety concerns: Parents and guardians may become more anxious about their children’s online activities. They may install parental control software or bar their children from platforms like Instagram, which can also curtail positive online experiences, such as keeping in touch with friends.

Contact the Social Media Victims Law Center for a Free, Confidential Case Evaluation

The Social Media Victims Law Center works to hold social media companies legally accountable for the harm they inflict on vulnerable users. We apply principles of product liability to force social media companies to design safer platforms and protect users from foreseeable harm.

If your child was traumatized by child sex abuse material or groomed by a pedophile on Instagram, contact Social Media Victims Law Center. We will sit down with you and your child to determine the best course of action. Depending on your case, you may be eligible to file a lawsuit against Instagram for exposing your child to harm.

Fill out our online form or call (206) 741-4862 for a free, confidential case evaluation with Social Media Victims Law Center.
