There's a privacy explanation for why Apple doesn't let you delete Siri recordings

Apple's voice assistant isn't tied to your account.

Alfred Ng Senior Reporter / CNET News

The moment you activate Siri, Apple ties the data to a random identifier that the company can't trace back to you.

James Martin/CNET

Apple's reputation for respecting privacy was called into question last week with news that contractors listen to Siri recordings. That sparked an understandable clamor for better control over voice data collected by the digital assistant. If you own an Apple product, you might want the option to delete your recordings from the company's database. 

Here's the rub: Apple can't delete specific recordings. And that's to protect your privacy.

Unlike Google and Amazon, which collect voice data and associate it with an individual account, Apple's Siri recordings are given a random identifier each time the voice assistant is activated. That practice means Apple can't find your specific voice recordings. It also means voice recordings can't be traced back to a specific account or device. It may sound counterintuitive, but that's actually a privacy feature.

Apple landed in hot water last week when The Guardian reported that contractors were listening to anonymized audio from conversations with Siri. Some of the dialogue included private details of people's lives, such as discussions with doctors and sexual encounters, according to the report. The audio was used to check the voice assistant's accuracy, a process Apple called "grading."

The resulting outcry caused Apple to change its policies. On Thursday, the tech giant said it would suspend the program and give people the ability to opt out of Siri recordings that are graded. (Amazon and Google have made similar moves.)

"We are committed to delivering a great Siri experience while protecting user privacy," Apple said in a statement. "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."


Apple won't, however, be giving people the opportunity to delete their recordings. When asked about the possibility, Apple referred CNET to its iOS Security white paper from May for details.

The paper explains why you can't delete Siri recordings the way you can with Google's Assistant or Amazon's Alexa. The voice assistant works differently than its rivals. 

"When Siri is turned on, the device creates random identifiers for use with the voice recognition and Siri servers," the paper says. "These identifiers are used only within Siri and are utilized to improve the service. If Siri is subsequently turned off, the device will generate a new random identifier to be used if Siri is turned back on."

This is the key difference between Siri and Google Assistant or Alexa. It's easy to delete data from Google and Amazon because your recordings are associated with your account: go to your account settings and remove the recordings the tech giants have on you. Google and Amazon tie audio recordings to people's accounts because they can use them for personalization.

Apple doesn't rely on ad revenue for its profits. It makes money by selling hardware and services. Sure, it has lots of audio data. But unless a user talks about personally identifiable information, Apple can't know whose data it is.

"Personally, I would prefer that method 10 times out of 10 as compared to my data being identifiable to a major corporation," said Ted Harrington, an executive partner at security company Independent Security Evaluators.


Even if contractors heard a recording with personal information, they wouldn't be able to find more audio from a specific account. Finding a specific person's audio data from anonymized Siri recordings is like finding a needle in a haystack.

If I asked Apple to delete my specific audio data, it couldn't, because it wouldn't know which of the millions of Siri recordings it's collected belong to me. Even if I said on every recording, "This is Alfred Ng," somebody would have to listen to millions of recordings to find mine. 

When you download all the data Apple has on you, which is an option the company introduced in 2018, that doesn't include audio sent to Siri. 

Apple knows your sign-in records, data stored on iCloud, app usage details, your downloads and purchases from the App Store, and marketing communications, but it has no Siri recordings it can tie to you. Apple can't delete something it can't find.

A parade of privacy scandals has led to distrust of tech giants. Adding identifiers to audio data that's already anonymized, however, would give a false sense of security.

"The failure of some vendors to keep their privacy promises and play fast and loose with consumer data can make it hard for some people to accept assurances that their data has been properly anonymized and their privacy is truly protected," said Stephen Cobb, an independent security researcher. 

Apple's random identifier means you can't delete your recordings. It also means that privacy is built in by default. You don't have to navigate through settings and delete your recordings one by one.

"In the moments when I forget to delete my data or find myself too busy to delete it, I am authorizing [companies] to keep access to data that can be directly correlated to me," Harrington said. "By anonymizing the data up front, I've made the choice to allow the company to use the data for their business purposes but made it much easier on myself to avoid being exposed."

Ideally, there would be a system that allowed for both -- complete anonymity and the ability to control what data companies retain, he said. But you can't have your privacy cake and eat it too.