
Apple to beta test iMessage feature that warns kids about nude imagery

The new software feature is part of a suite of tools Apple is building to fight child exploitation and abuse.


Several months back, Apple proposed tools to protect children, but the company has since delayed their release.

James Martin/CNET

Apple will begin beta testing a text messaging feature designed to protect children from sending or receiving nude images, the company said Tuesday. The new iMessage feature, which Apple adjusted after receiving feedback from critics, is part of a series of capabilities designed to fight child exploitation.

Apple's new iMessage feature will analyze attachments in messages sent to users marked as children to determine if the attachments contain nudity. Apple said it'll maintain message encryption as part of the process and that the feature is designed so that no photo or indication of detection leaves the device.

The tech giant also said it's changed how the system works. Initially, Apple intended to alert parents of children under the age of 13 if the kids viewed or sent a flagged image anyway. Now Apple will let the children choose whether to alert someone they trust. And the choice is separate from whether they view the image.


Apple's new system for child safety in iMessage.


Apple's move is the latest in the company's efforts to add child protection tools to its devices. Earlier this year, Apple announced plans for the iMessage feature along with a separate tool to detect child exploitation imagery stored on some Apple devices. Apple said it built technology that would avoid scanning images on its servers, as many other companies, including Facebook, Microsoft and Twitter, do today. Instead, the tech would scan images on the phone itself.

Apple's message detection feature is separate from its plans to scan for child exploitation imagery, which the company delayed in September.

As part of its messages feature, Apple said it's expanding guidance that its Siri voice assistant will provide when children or parents ask about problematic issues. That includes information about how and where to file reports about child exploitation.