YouTube Kids Is Releasing An App Moderated By Actual Human Beings
This ‘whitelisted’ version will add more safety measures to the app’s video content
We may think our kids are relatively safe watching YouTube Kids, but news coverage over the past year has brought to light the disturbing, often inappropriate content still available on the app. The video-sharing giant is finally looking to address this by having actual people, instead of algorithms, determine what’s appropriate for children.
According to BuzzFeed News, a version of the app featuring only videos selected by YouTube curators will soon be an option for parents who want to guard more closely what their kids view. This “whitelisted,” human-moderated version will be available alongside the current algorithmic one, offering parents a choice in how their kids access the app’s content. Though I’m not sure it’s really a choice when the algorithmic version offers kids options that include explicit sexual language, profanity-laced parodies like one of the film Casino featuring Bert and Ernie, adult content about pornography and child suicide, and jokes about drug use, Polygon reported.
The new option, which should be available within a couple of weeks, gives parents a bit more peace of mind about what our kids are doing online. While we might think we know what our kids are watching, it only takes a second to click from one video to another, and unless we’re sitting next to them 100 percent of the time (which is an impossibility), odds are they’ve already come across inappropriate content.
I’ve walked into our living room on many occasions and heard a swear word or violent content coming from our family iPad. Our kids know what they’re doing is wrong, but like kids do, they’re testing boundaries and trying to figure out how much they can get away with. While we, like many parents, feel safe setting age restrictions, turning on “restricted mode,” and adding parental controls to electronic devices, it’s simply not enough.
YouTube CEO Susan Wojcicki said the company would grow its moderation corps to more than 10,000 people in 2018 to help combat “content that might violate our policies,” Polygon reported.
While this whitelisted option and human moderation are a good start, according to Gizmodo, they may not be enough. “It won’t address other issues with YouTube Kids like consistent accusations the whole operation essentially cashes in on lapses in regulatory oversight to expose children to ad-deluged content that would run into Federal Trade Commission trouble on, say, TV,” the site said.
The bottom line for parents: no matter what tech is available to make the internet safer, we still need to stay informed about what’s available to our kids online and remain vigilant in monitoring what they’re watching as much as possible.