Fight against child sex abuse images requires smarter tech, Google expert says

Elie Bursztein says technology can make fighting these crimes more efficient.

Laura Hautala

Some of the worst jobs in technology involve finding and stopping the spread of child sexual abuse images. They're horrible because they require people to see images that can leave deep emotional scars.

What's more, technology is making this important job harder: phone cameras make it easier to create more of this content, which then changes hands on a vast internet black market.

Elie Bursztein, who directs the anti-abuse research team at Google, says software and artificial intelligence can make it easier to find and stop the spread of these images. That's a crucial step in rescuing children and catching perpetrators.

"It's us up to us to build transformative technology," Bursztein said, "and keep our kids safe."

The challenge is getting harder. Child exploitation images exploded to 18.4 million in 2018, roughly 6,000 times the 3,000 recorded in 1998, Bursztein said Monday at the Enigma Conference, a security and privacy meeting, citing data from the National Center for Missing and Exploited Children (NCMEC), a US nonprofit that tracks child sexual abuse images. That's far too much material for investigators and content reviewers to deal with efficiently, he said.

At Google, Bursztein partners with NCMEC, as well as Thorn, a company that builds technology to defend children from sexual abuse. He believes artificial intelligence tools and software can help counter the problem by helping investigators find crime-solving images faster. One potential tool would analyze images and tell investigators how much useful, identifying information each contains, such as how many features of a person's face are visible.

That would spare investigators from sifting through images with obscured faces and let them focus on the ones most likely to help them rescue a child or catch an abuser.
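
Google hasn't published how such a tool would work. As a rough, hypothetical sketch of the idea, the snippet below uses OpenCV's bundled Haar-cascade detectors (an illustrative stand-in, not Google's method) to count visible facial features and turn that into a crude triage score:

```python
# Hypothetical sketch: score how much identifying facial detail an image shows.
# OpenCV's stock Haar cascades stand in for whatever detector a real triage
# tool would use; the 0-3 scoring scheme is purely illustrative.
import cv2

FACE = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
EYES = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def identifying_info_score(path: str) -> int:
    """Return 0 if no face is visible; higher means more identifying detail."""
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    faces = FACE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return 0                          # nothing useful for identification
    score = 1                             # at least one face region is visible
    x, y, w, h = faces[0]
    eyes = EYES.detectMultiScale(gray[y:y + h, x:x + w])
    return score + min(len(eyes), 2)      # visible eyes add identifying detail

# A review queue could then be sorted so the richest images come first:
# queue.sort(key=identifying_info_score, reverse=True)
```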

Another possibility is a software interface that helps investigators prioritize images, surfacing the ones most likely to lead to breakthroughs. Finally, deep learning, an artificial intelligence technique, could group related images together.
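
Bursztein didn't detail how that grouping would work. One plausible sketch, assuming a pretrained image model for embeddings and off-the-shelf clustering (torchvision's ResNet-18 and scikit-learn here are illustrative choices, not Google's stack), might look like this:

```python
# Sketch: group visually related images by clustering deep-learning embeddings.
import torch
import torchvision.models as models
from PIL import Image
from sklearn.cluster import AgglomerativeClustering

weights = models.ResNet18_Weights.DEFAULT
backbone = torch.nn.Sequential(*list(models.resnet18(weights=weights).children())[:-1])
backbone.eval()
preprocess = weights.transforms()       # resize/normalize as the model expects

@torch.no_grad()
def embed(paths):
    """One 512-dimensional feature vector per image."""
    batch = torch.stack([preprocess(Image.open(p).convert("RGB")) for p in paths])
    return backbone(batch).flatten(1)

def group_related(paths, threshold=0.3):
    """Label images so that ones close in cosine distance share a cluster."""
    clusterer = AgglomerativeClustering(
        n_clusters=None, distance_threshold=threshold,
        metric="cosine", linkage="average")  # 'affinity' on older scikit-learn
    return clusterer.fit_predict(embed(paths).numpy())

# labels = group_related(["a.jpg", "b.jpg", "c.jpg"])
# Images that share a label likely show the same scene, place or person,
# letting an investigator review them as one case rather than one by one.
```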

Technology won't replace human investigators, who will ultimately have to review flagged images, Bursztein said. Determining whether someone is involved in a crime can't be left to machines.

"It's a very serious accusation," he said, adding "we have to make sure."

Bursztein also said some techniques might help blunt the emotional wear on investigators. A technique called style transfer can make images less disturbing to look at by altering a photo so it resembles an illustration. That lets a human investigator understand what's happening in the image without being exposed to lifelike photographic detail.
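
Google hasn't described its exact implementation. As a loose illustration of the kind of transformation involved, OpenCV's built-in non-photorealistic filter (a simple stand-in for a neural style transfer model) renders a photo as something closer to a watercolor painting:

```python
# Sketch: soften photographic detail by rendering an image as an illustration.
# cv2.stylization is a stand-in for the neural style transfer Bursztein
# describes; sigma_s and sigma_r control how painterly the result looks.
import cv2

def soften(in_path: str, out_path: str) -> None:
    img = cv2.imread(in_path)
    # Edge-preserving smoothing: shapes stay legible, lifelike texture fades.
    stylized = cv2.stylization(img, sigma_s=60, sigma_r=0.45)
    cv2.imwrite(out_path, stylized)

# soften("flagged.jpg", "flagged_stylized.jpg")
```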

Style transfer hasn't been tried on child abuse imagery yet, so it's unclear how effective it will be there. Still, Bursztein and his collaborators applied it to other explicit and violent images and found it lowered the emotional impact on human content moderators by 16 percent.