Google releases AI tool to identify child sex abuse images online

Free API will help reviewers remove illegal material quicker, Google says.

Steven Musil

Google on Monday released a free artificial intelligence tool to help companies and organizations identify images of child sexual abuse on the internet.

Google's Content Safety API is a developers' toolkit that uses deep neural networks to process images in such a way that fewer people need to be exposed to them. The technique can help reviewers identify 700 percent more child abuse content, Google said.
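Google hasn't published the interface details in its announcement, but the basic idea is triage: a classifier scores images, and only the likeliest matches are surfaced to human reviewers, in priority order. Here is a minimal sketch of that triage step, using entirely hypothetical names (Image, build_review_queue) rather than the real Content Safety API:

```python
# Hypothetical sketch: rank images by a classifier's confidence score so
# human reviewers only see the highest-priority items first.
from dataclasses import dataclass

@dataclass
class Image:
    image_id: str
    score: float  # assumed classifier confidence that the image is abusive, 0.0-1.0

def build_review_queue(images, threshold=0.8):
    """Return only the images a reviewer needs to see, highest score first."""
    flagged = [img for img in images if img.score >= threshold]
    return sorted(flagged, key=lambda img: img.score, reverse=True)

# Example: three scored images; only two cross the review threshold.
queue = build_review_queue([
    Image("img-001", 0.97),
    Image("img-002", 0.15),
    Image("img-003", 0.86),
])
print([img.image_id for img in queue])  # ['img-001', 'img-003']
```

Because low-scoring material never reaches the queue, reviewers handle more genuine cases while being exposed to less content overall, which is the gain Google is describing.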

"Quick identification of new images means that children who are being sexually abused today are much more likely to be identified and protected from further abuse," engineering lead Nikola Todorovic and product manager Abhi Chaudhuri wrote in a company blog post Monday. "We're making this available for free to NGOs and industry partners via our Content Safety API, a toolkit to increase the capacity to review content in a way that requires fewer people to be exposed to it."

The use of AI is spreading like wildfire across the tech industry for everything from speech recognition to spam filtering. The term generally refers to technology called machine learning or neural networks that's loosely modeled on the human brain. Once you've trained a neural network with real-world data, it can, for example, learn to spot a spam email, transcribe your spoken words into a text message or recognize a cat.
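For a sense of what "training" on real-world data means in practice, here is a toy, non-Google example: a tiny spam classifier built with scikit-learn. It illustrates the general supervised machine-learning idea the paragraph describes (a simple bag-of-words model rather than a deep neural network), not the system Google has built.

```python
# Toy illustration of supervised machine learning: train a classifier on
# labeled examples, then let it score new, unseen text.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# A handful of hand-labeled training examples (1 = spam, 0 = not spam).
messages = [
    "WIN a FREE prize, click now",
    "Lowest price pills, limited offer",
    "Lunch tomorrow at noon?",
    "Here are the meeting notes from today",
]
labels = [1, 1, 0, 0]

# Bag-of-words features fed into a naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)

# The trained model can now label text it has never seen.
print(model.predict(["Claim your free prize today"]))    # likely [1]
print(model.predict(["Can we move the meeting to 3pm"]))  # likely [0]
```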

The Internet Watch Foundation, which works to minimize the availability of child sex abuse images online, applauded the tool's development, saying it will make the internet safer.

"We, and in particular our expert analysts, are excited about the development of an artificial intelligence tool which could help our human experts review material to an even greater scale and keep up with offenders, by targeting imagery that hasn't previously been marked as illegal material," Susie Hargreaves, CEO of the UK-based charity, said in a statement. "By sharing this new technology, the identification of images could be speeded up, which in turn could make the internet a safer place for both survivors and users."

