
Facebook's New Features Aim To Make Its Apps Safer For Teens


In the wake of internal company research revealing just how damaging Facebook and Instagram are for teens, the company is now implementing new safety features

You probably don’t need us to tell you the many ways social media can negatively impact the mental health of teens and young adults. But in recent weeks, concrete evidence from Facebook itself has shown that Facebook and Instagram are actively harming their youngest users’ body image, self-esteem, and mental health, with a former employee coming forward to confirm the worst of it by sharing internal research with the Wall Street Journal. Now it looks like the company is taking small steps to protect its teen users.

Facebook, which also owns Instagram, announced that it’s implementing new tools on its platforms aimed at keeping teens safer, including prompts encouraging teens to take a break when their screen time on Instagram hits a certain threshold, as well as a “nudging” feature to help steer them away from potentially damaging content, according to the company’s vice president of global affairs, Nick Clegg. Clegg appeared on several news programs on Sunday to announce the company’s plans.

The company is also planning to introduce new optional parental controls so that parents or guardians can monitor what their tweens and teens are doing online. Clegg declined to share specifics, such as what qualifies as harmful content or how customizable the new safety features will be. But parental controls have existed for as long as the internet has, and it’s a safe bet that tweens and teens will find ways to outsmart these ones and spend ample time on the apps anyway.

Thankfully, the much-maligned Instagram for Kids remains on pause, but it’s unclear whether the company is doing anything to weed out content that could harm young users. Nudging kids to step away or look at something else is all well and good, but it doesn’t change the fabric of the apps, which serve up an onslaught of potentially harmful content with seemingly little effort from Facebook and Instagram to remove it before kids see it.

“We are constantly iterating in order to improve our products,” Clegg told Dana Bash on CNN‘s State of the Union. “We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use.”

Of course, it’s a start, but the effects of these apps on young users around the globe are so far-reaching that these measures feel like too little, too late. There’s simply too much at stake when it comes to the well-being and safety of kids.