Nearly 100 policy and rights groups push Apple to abandon plans to scan iPhones for child abuse

Groups including the Center for Democracy & Technology, the ACLU and the EFF say Apple's plans weaken user privacy.

Ian Sherr Contributor and Former Editor at Large / News

A number of international policy groups say Apple's push is misguided.


A coalition of more than 90 US and international organizations sent an open letter to Apple CEO Tim Cook on Thursday, urging him to halt the company's plans to build new child safety features into its iPhones, iPads and Macs. The new capabilities, which Apple plans to release as part of free software updates in the coming months, could be twisted into tools of surveillance, the group warned.

"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the group said in the letter, whose signatories include the Center for Democracy & Technology, the American Civil Liberties Union, the Electronic Frontier Foundation and Privacy International.

Apple declined to respond to the letter directly. Since announcing its plans earlier this month, the company has published research papers and technical breakdowns of its technology in an effort to allay concerns that many privacy and security advocates have raised.

Read more: How Apple's decision to expose more child abusers might affect you

"It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood," Craig Federighi, Apple's head of software engineering, said in an interview with The Wall Street Journal published last Friday. "We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing."

The public letter underscores the charged debate Apple ignited when it announced child safety technology being built into its upcoming iOS, iPadOS, watchOS and Mac software. Apple said it developed the technology with privacy in mind. One of its new features scans images and videos sent or received by children's accounts set up through its Messages app. Apple plans to obscure such a message and warn that it may be explicit. If the child still sends or decides to view the received message, parents can be alerted -- though Apple itself isn't. Privacy advocates say they're worried Apple's system may incorrectly flag health information or educational resources.

A different and more controversial feature Apple is building bucks the typical industry practice of scanning images and videos for known copies of child exploitation material after they're uploaded to the internet. Instead, Apple said it wants to scan a portion of what's stored on users' devices, in what it describes as a privacy-protecting way.

Read more: Apple's plan to scan phones for child abuse worries privacy advocates

Many privacy and security experts have come out against those plans too, though. They say that though Apple's intentions may be good, its technology could be twisted into tools of surveillance by totalitarian governments. "We can expect governments will take advantage of the surveillance capability Apple is building into iPhones, iPads and computers," Sharon Bradford Franklin, co-director of the CDT Security & Surveillance Project, said in a statement. "They will demand that Apple scan for and block images of human rights abuses, political protests, and other content that should be protected as free expression, which forms the backbone of a free and democratic society."

Apple has promised it would refuse any demands to expand its detection system beyond known images of child exploitation, though in the past it's also said it follows local laws in the countries where it operates. Apple has said its system is designed so that only specified child advocacy organizations can change what's in its scanning database. The company has also promised to publicly share identifying code for each update to its database as a way to ensure people's devices haven't been tampered with.