A Roomba Took Video Of A Woman Using The Bathroom — And Screenshots Ended Up Online
The leaked screenshots have many wondering just how much data their smart devices collect and how it is used.
Roombas, or any other robot vacuum for that matter, can be a lifesaver for parents who have too much to juggle and need a little help cleaning up the house. But for every technological convenience and “lifesaver” granted to us, there seem to be increasingly insidious tradeoffs when it comes to letting smart devices monitor us and collect our data in so many ways.
Case in point? Photos of a woman using a toilet, captured by a test-model Roomba that records video to improve the robot’s ability to navigate different floor plans and obstacles, were leaked to private Facebook groups.
The test model was designed to record video as it cleaned and then send that footage to Scale AI, a startup that contracts workers around the world to label audio, photo, and video data (a process known as data annotation) used to train artificial intelligence.
“How do our robots get so smart? It starts during the development process, and as part of that, through the collection of data to train machine learning algorithms,” Colin Angle, CEO, Chairman and Founder of iRobot, explained in a LinkedIn post. “Collecting this data enables us to build the intelligence inside our products that powers features like object recognition and avoidance, room identification and customized cleaning suggestions.”
Ostensibly, this private data is supposed to be used for training only, but MIT Technology Review obtained 15 screenshots that had been posted in private Facebook groups. There were photos of kids lying on the floor and giving the robot vacuum a silly stare, along with the photo of a woman using the toilet. And while her face is blurred out in the main photo, others show her fully, a wild violation of privacy.
While these photos came from test vacuums that iRobot employees and other volunteers consented to use in their households, the leaked images are a stark reminder of how quickly many people sign off on letting smart devices monitor their data, and of how that data is ultimately handled by humans who could leak it or use it for purposes other than those intended.
“When you have data that you’ve gotten from customers, it would normally reside in a database with access protection,” Pete Warden, a leading computer vision researcher and a PhD student at Stanford University, told Technology Review.
With the kind of machine-learning training used in the test Roombas, however, customer data is combined “in a big batch,” Warden said, widening the “circle of people” who get access to it.
In order for the AI to learn, data annotation services claim they need access to human faces. Neither adults’ nor children’s faces are considered “sensitive” data, and they are often sent to data centers without being censored. And, as some people have learned, this data can be sold to third parties and used in other ways, like when a woman was denied entrance to Radio City Music Hall while there with her daughter’s Girl Scout troop, thanks to facial recognition.
With Christmas just days away, now might be a good time for parents to talk to their kids about data safety and digital privacy, especially if Santa is bringing the kiddos any smart devices. And in general, read the fine print before agreeing to share your data with companies.