big brother is watching

Everything Parents Should Know About Sora, OpenAI’s New “Brain Rot” App

Cybersecurity experts fear how children’s likenesses will be used against them if they try Sora.

by Katie McPherson

Brain rot. Slop. Pointless. AI-generated videos have been called many things by many experts, and it’s rarely anything good. OpenAI, the creators of ChatGPT, have launched a new app, Sora, that allows users to enter text prompts and create realistic videos, even using real people’s likenesses, including their own. The app is currently in early development and is invitation-only, but OpenAI has announced plans to roll Sora out to teens — and that has cybersecurity experts seriously worried.

Scary Mommy spoke with Ben Gillenwater, aka the Family IT Guy, to find out everything parents need to know about Sora.

What is Sora & how does it work?

Sora is a video-generation app that allows users to type in a prompt, and AI will create a video based on the description. Users can create Cameos, adding their own face into the videos, and remix others’ video creations. Like other social media apps, it features an endless feed you can scroll through to see other random users’ videos, as well as give likes, leave comments, and send DMs.

Sora has already begun receiving backlash from the families of deceased celebrities and historical figures, including the daughters of Martin Luther King Jr., Malcolm X, and Robin Williams. Sora has since blocked the use of King’s likeness after his estate brought attention to the racist ways users were using his face. Kristelia García, an intellectual property law professor at Georgetown Law, told NPR that OpenAI tends to ask forgiveness and not permission when it comes to using copyrighted material in general, and that “right-to-publicity and defamation laws vary by state and may not always apply to deepfakes.”

What are the risks of using Sora?

OpenAI is losing tens of billions of dollars a year and isn't expected to become profitable until 2029, Gillenwater says. Video generation is the most expensive task AI performs. So it's worth asking why the company would give us, and our kids, access to this tool for free.

“Why would they operate at a loss and then give away a video generation tool? There’s always a trade. Nothing is free, so what are we trading?” Gillenwater says.

Sora is collecting our identifying data. This reduces users' privacy for life, Gillenwater says.

There is so much about us that is unique and identifying, from our one-of-a-kind irises and retinas to the way we walk and how our voices sound. "Once you have recorded the unique attributes of every individual, the potential for public privacy goes way down," Gillenwater says. This means you could essentially be tracked anytime you move through public spaces where there are cameras: at stoplights, walking past residential doorbell cams, anywhere.

If that sounds outlandish and dystopian, well, Gillenwater says it's already happening. Communities around the country are asking city leadership to remove AI-based license plate reader cameras from intersections after Border Patrol agents were found to be using the cameras to surveil the streets in Auburn, Washington. A Kansas police chief was caught tracking his ex-girlfriend using LPRs, and there have been multiple cases of innocent people held at gunpoint by law enforcement because these cameras' AI systems misidentified them as suspects in crimes they didn't commit.

So, if you think only criminals should be worried about their public privacy, he encourages you to think again. “Corporate entities and government, if you give them data about your likeness, your movements, your conversations, who you talk with and who you have relationships with, where you go and where you don't go, and where you spend your money, that will be used against you.”

It is too soon to know how the data collected by Sora will be used, Gillenwater says, but we can theorize. Early adopters of Facebook who joined thinking they were just networking with friends have since learned all the ways Meta now mines and makes money off of our personal data. It’s not just about who owns the data now, but who will own it in five, 10, or 20 years, he notes. How might government regulations on how AI uses personal data tighten or relax in the coming decades? The safest bet is to just opt out, according to Gillenwater.

“When it comes to mass data, corporate and government entities operate on the basis of not should we do something, but can we do something. So can we, for example, analyze this huge data set that uniquely identifies every single person that uses our tools? Yes, of course. The data is there. So will they do it? I’d bet an enormous amount of money on yes. I see Sora as a trap.”

Users your child knows, and those they don’t, can DM them and use their likenesses as they wish.

In 2024, the National Center for Missing and Exploited Children received upwards of 456,000 reports of sextortion, in which children under the age of 18 are extorted for money or real-life sexual favors using nude photos of them. As Gillenwater recently learned from an Internet Crimes Against Children detective, roughly 100,000 of those cases involved nude photos generated by AI. In other words, someone took a regular photo of a child from somewhere online and used AI to create a nude photo of them.

“They just had a regular photo of them up on social media that was publicly accessible, and then the criminal or the predator can take that photo and then attach a nude body to it and then extort the kid with it,” he says. “If we look at Sora, we are providing video footage of our hyper-realistic likeness as a child. There’s no doubt in my mind that that is the dream of every predator on the planet. This is a silver platter that is like, here are children that you can victimize. Which one would you like to victimize today?”

There aren’t many laws in place yet to protect children from deepfakes, but some victims are taking their perpetrators to court. In many publicized cases, it is other children at the victim’s school who created the fake nudes as a method of bullying. Experts like Gillenwater fear what could happen next with incredibly realistic video generation technology, and classmates’ Cameos, available at the touch of a screen.

Sora is designed like TikTok with an endless scroll feed.

Apps like TikTok, Instagram, and Snapchat have bottomless feeds because they are designed to keep your attention and gather as much data as possible about the user, Gillenwater explains. Sora works the same way, making it another entry in the list of potentially addictive apps affecting teens' mental health.

“These social media or social media-adjacent apps like Sora are in the business of capturing as much of your attention as possible. Because the more attention they get, the more they can understand you, and the better they can understand you, the better they can leverage you and manipulate you and sell you things,” he says.

How should parents talk about Sora with their kids?

This advice may seem counterintuitive, but Gillenwater advises that the first thing parents should do is download Sora and try it for themselves. Do not use the Cameo feature and give the app your likeness, he says, but get familiar with how other people are using it. Scroll through the security and privacy settings, see how easy it is to interact with strangers — everything. Then, get clear on your family’s values around social media use.

"When they say, 'But why can't I use it?' Instead of being like, 'Well, I saw a headline in the paper that said it was scary,' I go, 'Well, I saw it for myself, and our values don't match its values because it's trying to take our attention. It's trying to connect us to strangers. It's trying to convince us to give our likeness, our privacy,'" he says. "If you're a person that desires and values privacy, then it's important to teach your kids about what that means so that they can start to develop critical thinking and develop a healthy skepticism for not just falling headfirst into a trap."

When I ask Gillenwater how he might manage the risks of Sora in his own household, he says he'd probably begin the conversation like this: "Hey kids, I'm sorry, but on our home router and your tablets and stuff, I've blocked all of OpenAI's services, so you can't use them."

As parents, we teach our children about stranger danger in the old “don’t talk to the guy with the white van at the park alone” way. But we need to be thinking about stranger danger in a more advanced way now.

“Have a more direct conversation about the principles involved in privacy and how dangerous it can be to assume that a person you don’t know online has your best interests in mind. That person could be somebody that you’re DMing with. That person could also be the engineer that wrote a piece of software that you’re about to use,” Gillenwater says. “When you engage in a system that has an online chat with strangers, you must have your guard up all the way, all the time. If somebody comes to you and they’re super friendly, unfortunately that’s a red flag. Sora brings that risk with it.”

With teens, you might also discuss the value of attention, he adds. “Attention is one of our most fundamental currencies. We can spend that currency by intentionally giving our attention to those around us and those that we care about.”

One of the most important things parents can do is model safe online behavior for their kids, Gillenwater says. How do you confront addictive algorithms and use your attention like currency? Where are you spending it? Do you prioritize privacy?

“You can demonstrate, ‘Here is why I have intentionally reduced my screen time on Instagram from three hours a day to two hours a day. I’m on my way to reducing that because our family values mental health, we value our attention, and we value each other. And if I spend one hour less on Instagram per day, then I have one hour more to spend with you.’ Those kind of conversations I think are really impactful, for the younger kids especially.”
