Apple delays plan to scan iPhones for child abuse images

The feature, intended as part of an update for iPhones, iPads and Mac computers this fall, raised a number of privacy concerns.

Ty Pendlebury Editor
Ian Sherr Contributor and Former Editor at Large / News

Apple on Friday delayed a set of features designed to protect children from sexual predators on some of its iPhones, iPads and Mac computers. The move follows criticism from privacy advocates and security researchers who worried the company's technology could be twisted into a tool for surveillance.

In a statement, Apple said it would delay its new tools to identify images of child abuse on its devices as well as features to warn children about sexualized messages sent by SMS or iMessage. Apple had announced the tools last month.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," a company spokesman said. The company didn't respond to a request for further comment about when it plans to reintroduce these technologies.

It was a surprise reversal by Apple, which had argued for weeks that its new features were built in thoughtful and privacy-protecting ways. Apple has promised for years that its devices and software are designed with privacy in mind, positioning them as an alternative to devices built with Google's Android software. The company even dramatized that stance with an ad hung just outside the convention hall of the 2019 Consumer Electronics Show, which read, "What happens on your iPhone stays on your iPhone."

"We at Apple believe privacy is a fundamental human right," Apple CEO Tim Cook has often said.

Still, that didn't calm policy and advocacy groups, nearly 100 of which signed an open letter to Apple asking it to reconsider implementing its technology shortly after it was announced.

"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the group said in the letter, whose signatories include the Center for Democracy and Technology, the American Civil Liberties Union, the Electronic Frontier Foundation and Privacy International.

The technology encountered resistance in part because it would warn parents and children when they might be sending or receiving a sexually explicit photo through the Messages app. Privacy experts, who agree that fighting child exploitation is a worthy goal, worried that Apple's moves might open the door to wider uses that could, for example, put political dissidents and other innocent people in harm's way.

"Apple's plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history," wrote Evan Greer, director of the advocacy group Fight for the Future, in a statement. "It's encouraging that the backlash has forced Apple to delay this reckless and dangerous surveillance plan, but the reality is that there is no safe way to do what they are proposing. Apple's current proposal will make vulnerable children less safe, not more safe. They should shelve it permanently."

Proponents of Apple's plans were disappointed by the company's pause on the technology's rollout, arguing that the tech giant had taken a thoughtful approach to a tough problem.