Ashton Kutcher’s organization dedicated to protecting children from sexual exploitation — Thorn — was recently granted a portion of $280 million through The Audacious Project at TED to help fund its fight to eradicate child sexual abuse materials (CSAM) from the internet.
In 2016, Thorn, an organization co-founded by actor and tech-developer Ashton Kutcher and actress Demi Moore, developed Spotlight, a software used to help identify victims of child sex trafficking in the United States. With Spotlight’s software stretching into all 50 states and Canada, Thorn has already helped to identify more than 9,000 children.
There are more than 150,000 escort ads posted online every day, and somewhere in the midst of that wreckage, traffickers are selling children.
According to a survey conducted by Thorn, three out of four trafficking survivors were advertised online. With Spotlight, law enforcement is identifying more than eight kids per day on average; victims are identified in 83% of cases, and critical search time is reduced by 63%.
As part of this initiative to eliminate child sexual abuse material from the internet, Thorn has built a new product called Safer, a low-cost, scalable tool for the swift review, removal, and reporting of child sexual abuse material, with the aim of ending its viral distribution.
Some small- and mid-sized companies are hesitant to adopt this kind of software, citing limited resources and limited awareness of the dangers of allowing user-generated content, which ultimately leaves their platforms susceptible to CSAM. Thorn is hoping to uproot those hesitations with Safer, an affordable tool already used by five beta partners to identify, remove, and report CSAM from their websites.
Before the rise of the internet, law enforcement had nearly eliminated the trade in CSAM, which was typically distributed by mail. But since 2004, the National Center for Missing and Exploited Children (NCMEC) has witnessed a harrowing 10,000% increase in the number of children sexually exploited on the internet — with 450,000 child sexual abuse files reviewed in 2004, and a staggering 25 million files reviewed in 2015.
Thorn’s success is due in part to continuous global support from companies including Amazon Web Services, Google.org, Microsoft, Twitter, and Facebook, as well as its partnerships with child protection organizations such as the NCMEC. With this type of collaboration, Thorn’s software is better able to sift through a multitude of online clues so law enforcement can identify victims and prosecute offenders more efficiently.
Julie Cordua, CEO of Thorn, explains to Scary Mommy that these offenders lived in isolation before the internet, but now they feel emboldened, partly because of the conversations circulating in online chat rooms. “They can find thousands of people just like them and that then normalizes their behavior, and they have these communities that say, ‘Oh, it’s not that bad. Go ahead and do that,’” Cordua states.
Law enforcement finds that most offenders are in close proximity to the victim (a babysitter, teacher, coach, family member, etc.), and because of this, there is no single profile of the type of child likely to be targeted for CSAM.
“The internet has created an opportunity for mass distribution of images and videos of child sexual abuse, and the majority are very young children, often under 12,” Ashton Kutcher states in a press release. “This extraordinary amount of funding towards the elimination of child sexual abuse material from the internet illustrates that investors and donors recognize this is a rapidly growing and dangerous issue that needs to be addressed.”
Offenders are abusing children, documenting it, and then sharing that documentation with countless others like themselves on the internet. And according to Cordua, sexual abuse materials aren’t the only type of media surfacing on these sites. Due to easy online anonymization and ample access to camera phones, illicit sexting material involving minors is at an all-time high, and offenders are uploading this content as well.
Cordua hopes that as awareness of sexual exploitation grows among parents, we can pass that awareness on to our children and talk with them openly about what they may encounter, or what people may ask them to do, online.
“Parents have to take a deep breath, have a non-judgmental conversation with children, and open that door so they feel safe talking to us about what is happening in their online lives,” Cordua tells Scary Mommy.
Thorn is optimistic that this new funding will help not only with the engineering side of the new technology, but also with advocacy around CSAM.
“There is a layer of education we’re going to have to go through in each market, in each country, of elevating the discussion of this as a major issue so that it can motivate people and companies and law enforcement to take action,” Cordua commented.
Even within law enforcement, units that specialize in issues such as CSAM are often underfunded. And there can be a lack of urgency, because victims cannot scream for help through a virtual world.
“I think that the biggest hurdle is going to be really raising the bar of awareness, so that there is more of a willingness to participate in the situation,” Cordua added.
Technology continues to make great strides in other arenas, but in this case, Cordua’s fear is that it has created a “hotbed for abuse.” The way CSAM is viewed and distributed has changed along with technology, so law enforcement cannot rely on the same methods it once used to identify victims and prosecute offenders. But Thorn is driving a drastic change in the way we identify victims of CSAM online globally.
Introduced by @aplusk, @juliecordua of @Thorn is working to eliminate child sexual abuse online. She has a ground-breaking new approach to taking down abusive content & the abusers who share it. Visit https://t.co/BQxPa7VfOI to learn how to help. #TED2019 https://t.co/bZARh8b2MQ
— TED Talks (@TEDTalks) April 18, 2019
With the funding from The Audacious Project and continued donations from those eager to get involved, Thorn is working to eradicate CSAM, one identified victim at a time.