Safer Scrolling

Instagram Just Rolled Out New Kid Safety Measures

Instagram will automatically send new users under the age of 16 to a cleaner version of the platform. Media experts don’t think it’s enough.


Instagram has updated its Sensitive Content Control feature so that users under the age of 16 now default to the “less” setting, making it harder for teens and tweens to come across potentially sensitive content while scrolling. The setting applies across the app: Search, the Explore page, hashtags, Reels, feed recommendations, and suggested accounts. Accounts belonging to users over the age of 18 default to “standard,” and anyone over 18 also has the option of “more.”

In addition to this new default, Instagram is “testing a new way to encourage teens to update their safety and privacy settings”: a prompt that asks teens to review options such as who can share their content, who can contact them, and time management tools.


“The safety measures for minors implemented by Instagram today are a step in the right direction that, after much delay, start to address the harms to teens from algorithmic amplification. Defaulting young users to a safer version of the platform is a substantial move that could help lessen the amount of harmful content teens see on their feeds,” said Jim Steyer, founder and CEO of Common Sense Media, a nonprofit dedicated to helping parents and kids navigate technology and media.

“However, the efforts to create a safer platform for young users are more complicated than this one step, and more needs to be done,” he added. Experts like Steyer do not believe Instagram’s updated Sensitive Content Control feature goes far enough to protect young people from the harmful effects of social media, effects that a substantial body of research has documented.

"For years, teens have been exposed to drugs, eating disorders, and violent and sexual content on Instagram all while being nudged by algorithms to continue consuming that dangerous content,” Steyer continued in the press statement.


Meta, which owns Facebook, Instagram, and WhatsApp, knew that its recommendation algorithms and push notifications could easily hook younger users on the platform while feeding them harmful material. The company continued using those tactics until whistleblowers called it out. Some states, such as California, are even moving to sue the platform (along with TikTok) over its allegedly addictive tactics.

“Instagram should completely block harmful and inappropriate posts from teens' profiles. Instagram should also route users to this platform version if they suspect the user is under 16, even though they indicated at sign-up they are older. The platform's list of what types of content and accounts it considers sensitive should explicitly include content that promotes harmful behaviors like disordered eating and self-harm,” said Steyer.

"With the increasing mental health crisis among youth in the country, social media companies like Instagram and others need to be proactive in putting all the necessary safeguards in place to ensure teens are protected on their platforms instead of waiting to be reined in by legislation. The new measures are welcome, but alone they are insufficient to make the site a safer and healthier place for all young users,” the Common Sense Media SEO concluded.

Two weeks ago, Instagram also gave users the option to block weight loss ads. Beyond that topic, users can filter out sensitive content that depicts violence, may be sexual in nature, or serves targeted ads on triggering topics like tobacco, vaping, or cosmetic procedures. View the full list, and learn how to set Instagram’s sensitive content controls to your liking, here.