
A New Report Suggests Instagram Has Become A Key Hub For Pedophile Networks

The findings are shocking and grotesque.

A new report shows that pedophiles are using Instagram to sell and trade content.

A report from The Wall Street Journal has revealed that Instagram, the popular social media app owned by Meta, has become a key facilitator in connecting and promoting a large network of accounts “openly devoted to the commission and purchase of underage-sex content.”

The report should be a horrifying wake-up call for parents in more ways than one. Not only does the selling or trading of child pornography hurt everyone, but related hashtags and imperfect artificial intelligence mean that your kids could stumble upon this content while scrolling through social media, especially if you’re not monitoring them and are hoping Instagram will keep your kids safe for you.

“Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them,” the outlet reported. “Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”

The report also claims that Instagram facilitates the promotion of accounts that sell illicit images via “menus” of content for pedophiles to peruse like a person dining at a restaurant.

The report continues, “Certain accounts invite buyers to commission specific acts. Some menus include prices for videos of children harming themselves and ‘imagery of the minor performing sexual acts with animals’, researchers at the Stanford Internet Observatory found. At the right price, children are available for in-person ‘meet ups’.”

So, how does something like this even happen?

The report points to Meta’s dependence on automated detection tools as a key reason this occurs, because AI and robots don’t know any better (for now). The WSJ also highlighted Instagram’s algorithms, which essentially promote more harmful content to interested users, blindly, through related hashtags such as (trigger warning) #pedow**re and #preteensex.

Confusingly, when these hashtags appear, Instagram shows a warning pop-up, but it won’t remove the app’s ability to aggregate content from such disgusting search terms altogether. Make it make sense!

To give Meta some credit, its Community Standards enforcement reports show an uptick in actions in this area of late: over 8 million cases of child endangerment were acted on in 2023, according to Meta’s own reporting.

In response to questions raised by this report, Meta acknowledged problems within its enforcement operations and said it has set up an “internal task force” to address the issues raised.

“Child exploitation is a horrific crime,” the company said, adding, “We’re continuously investigating ways to actively defend against this behavior.”

This new report is just another deeply troubling reason why social media is such a minefield for children. In addition to harmful content, it can also serve kids misleading information, show them unrealistic images, encourage eating disorders, or enable bullying.

The American Psychological Association (APA) even released a new set of guidelines on social media use for kids last month, urging parents to limit and monitor the content their children are exposed to on apps like TikTok, Snapchat, Facebook, and, yes, Instagram.

The guidelines include ten recommendations to help parents make sure their kids develop healthy social media habits that don’t hinder their physical, mental, and emotional health. One of the main themes of the guidelines is parental involvement and social media literacy.

If parents keep up with the latest social media news and trends, as well as monitor their kids’ social media usage, the chances of a child having a balanced, safe, and meaningful relationship with social media grow considerably. However, companies like Meta also need to step up and stop dangerous and illegal content from ever appearing on their platforms.

Read the entire WSJ report here.
