
Apple reportedly tweaked Siri's responses to sensitive subjects like feminism, #MeToo

Developers who rewrote the Apple voice assistant's answers were advised to offer neutral responses about certain topics, according to The Guardian.

Sean Keane, Former Senior Writer
Sean knows far too much about Marvel, DC and Star Wars, and poured this knowledge into recaps and explainers on CNET. He also worked on breaking news, with a passion for tech, video games and culture.
Expertise: Culture, Video Games, Breaking News

Apple's Siri voice assistant was apparently rewritten to avoid the word "feminism."

Viva Tung / CNET

Siri is apparently in favor of "equality" but doesn't say the word "feminism," according to a new report. Apple suggested that developers working to rewrite its digital voice assistant's answers should "deflect" questions about supposedly sensitive topics like women's rights, The Guardian reported Friday.

Leaked guidelines from June 2018 told developers that Siri should approach questions about subjects like feminism or the #MeToo movement not by engaging, but by deflecting, informing and remaining neutral, according to the report. The guidelines were leaked by a contractor in an Apple program that checked the accuracy of Siri's responses to real-life queries. That program recently ended after privacy concerns were raised about workers listening to Siri conversations.

Siri's previous responses to some questions were apparently more dismissive -- "I just don't get this whole gender thing" and "My name is Siri, and I was designed by Apple in California. That's all I'm prepared to say" -- according to The Guardian.

"Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers," an Apple spokesperson in an emailed statement. "Our approach is to be factual with inclusive responses rather than offer opinions."

In May, a United Nations report said that giving Siri, Alexa and Google Assistant female voices by default is problematic because it reinforces the stereotype of women being "obliging, docile and eager-to-please helpers."

First published Sept. 6 at 5:44 a.m. PT.
Updated at 7:25 a.m. PT: Adds Apple statement.
