The latest revelation comes amid fallout over Apple's newly revealed plans to scan for child abuse imagery on some people's devices.
Apple reportedly has been scanning some users' emails for child abuse imagery since 2019, according to a new report, adding new details to the ongoing debate about the company's stance on user privacy. Earlier this month, Apple said it would implement a system to scan some people's iPhones, iPads and Mac computers for child abuse imagery, worrying security and privacy advocates who say the system could be twisted into a tool for government surveillance.
The company told the publication 9to5Mac it had been scanning iCloud Mail emails for child abuse imagery for the past two years, a detail it didn't appear to explicitly disclose to customers. Apple had said on earlier versions of its website that it "uses image matching technology to help find and report child exploitation" by looking at "electronic signatures" without providing more detail. Apple also told the publication it performed "limited" scanning of other data, without going into further detail other than to say it didn't include iPhone or iPad backups.
Apple didn't immediately respond to a request for further comment.
The latest revelation adds a wrinkle to the heated debate about Apple's approach to user privacy. For years, Apple has marketed its devices as more secure and trustworthy than those of its competitors. It's gone so far as to publicly criticize Google and Facebook over their ad-supported business models, telling customers that because Apple makes money by selling phones, it doesn't need to rely on ad tracking and other data-collection tools. Apple also mocked the tech industry with a billboard at the 2019 Consumer Electronics Show in Las Vegas, featuring a picture of an iPhone and the statement "What happens on your iPhone, stays on your iPhone."
When Apple announced its new scanning technology, it emphasized plans to run scans on devices enrolled in its iCloud photo library syncing service. The company said it preferred to run scans on the device rather than on its servers, arguing that this approach would allow privacy advocates to audit its systems and ensure they weren't being misused.
"If you look at any other cloud service, they currently are scanning photos by looking at every single photo in the cloud and analyzing it; we wanted to be able to spot such photos in the cloud without looking at people's photos," Craig Federighi, Apple's head of software engineering, said in an interview with The Wall Street Journal earlier this month.
Though privacy advocates question Apple's moves, the effort comes amid a surge in child abuse imagery across the web. The number of reported child sexual abuse materials jumped 50% in 2020, according to a report from The New York Times, a majority of which were reported by Facebook. Apple's anti-fraud chief suggested the problem was even larger, saying in a private message that his company's commitment to privacy had led it to become "the greatest platform for distributing child porn." The message was made public as part of Apple's ongoing legal battle with Fortnite maker Epic Games.