Apple puts out six-page FAQ on child abuse photo-scanning tech

The company is responding to privacy concerns over an upcoming feature.

Eli Blumenthal, Senior Editor, CNET
Sarah Tew/CNET

Apple made waves last week when it announced that its upcoming software updates will add a feature that scans the Photos app on people's iPhones, iPads, Mac computers and Apple Watches for child sexual abuse material. On Monday, the company put out a new document hoping to allay privacy concerns. 

The six-page document, called "Expanded Protections for Children," is a frequently asked questions guide on the forthcoming feature. 

"At Apple, our goal is to create technology that empowers people and enriches their lives," the company writes in its opening overview, "while helping them stay safe. We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)."

After acknowledging that some people have had concerns about how it will do this, the company says it put together the document to "address these questions and provide more clarity and transparency in the process." 

Apple says that the CSAM protection, which scans photos, "is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images." It adds that even possessing those images is "illegal" in most countries, including the US. 
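To picture what matching against "known CSAM images" means in practice, here is a minimal sketch of hash-based matching against a database of known images. It is purely illustrative: Apple's published technical summary describes a perceptual hash (NeuralHash) and on-device cryptographic matching rather than the plain SHA-256 comparison shown here, and the hash value and function names below are hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical digests of known flagged images (placeholder value for illustration).
let knownImageHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a SHA-256 digest of a photo's raw bytes as a hex string.
func digest(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Flag only photos whose digest matches the known database;
// any photo that doesn't match reveals nothing about its contents.
func matchesKnownImage(_ photoData: Data) -> Bool {
    knownImageHashes.contains(digest(of: photoData))
}

let samplePhoto = Data("example photo bytes".utf8)
print(matchesKnownImage(samplePhoto)) // false — no match against the known set
```

The key point the sketch captures is the one Apple makes in the FAQ: the comparison only ever answers "does this photo match a known image or not," rather than describing what is in the photo.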

The company adds that the feature will only impact those who use iCloud Photos to store their pictures and "does not impact users who have not chosen to use iCloud Photos." 

Apple says that the feature will not have any "impact to any other on-device data" and that it "does not apply to Messages." It also stresses that it will refuse any demands from governments looking to expand the feature to include non-CSAM images. 

"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future," the company writes. 

As for accuracy, Apple says that "the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year," with the company conducting a "human review" before sending any report to the National Center for Missing and Exploited Children (NCMEC). Apple concludes that "system errors or attacks will not result in innocent people being reported to NCMEC."
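To see how a per-account error rate can be that small even though individual image matches are imperfect, here is a back-of-the-envelope sketch. The numbers are hypothetical, not Apple's actual parameters; it only illustrates that requiring many independent matches before an account is flagged drives the combined false-positive probability far below the per-image rate.

```swift
import Foundation

// Hypothetical illustration: probability that an account with `n` photos
// accumulates at least `threshold` false matches, if each photo is falsely
// matched independently with probability `p` (a binomial tail probability).
func falseFlagProbability(threshold: Int, photos n: Int, perImageRate p: Double) -> Double {
    var total = 0.0
    for k in threshold...n {
        // log of C(n, k) via the log-gamma function to avoid overflow
        let logChoose = lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
        total += exp(logChoose + Double(k) * log(p) + Double(n - k) * log(1 - p))
    }
    return total
}

// Hypothetical numbers: a 1-in-a-million per-image false-match rate,
// 10,000 photos in the library, and a 10-match threshold.
print(falseFlagProbability(threshold: 10, photos: 10_000, perImageRate: 1e-6))
// Prints a value on the order of 1e-27 — vastly smaller than the per-image rate.
```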