Using female voices for AI assistants hurts women, UN report says

Default voices reinforce the stereotype of women being "obliging, docile and eager-to-please helpers," according to the report.

Shelby Brown, Staff Writer

Defaulting to Alexa instead of Alex may have larger consequences than you might think.

Chris Monroe/CNET

Digital voice assistants like Siri, Alexa and Google Assistant have a female voice by default. A new report from the United Nations says that's a problem.

These default voices reinforce the stereotype of women being "obliging, docile and eager-to-please helpers" that are "available at the touch of a button or with a blunt voice command like 'hey' or 'OK,'" according to the UNESCO report, released Friday.

The smart-home gadgets, which are often referred to as "she" or "her," have no choice but to respond. Digital assistants also can't defend against abuse, which reinforces the idea that women are "subservient and tolerant of poor treatment," the UN report said. 

While Apple's Siri and Microsoft's Cortana offer male voices, both default to female voices. Amazon's Alexa offers several accents, but all of its voices are female. Google says it has developed 10 voice options in the US, and that when customers set up a Google Home device, they have a 50-50 chance of getting either a traditionally female-sounding or a traditionally male-sounding voice.


"Siri's 'female' obsequiousness -- and the servility expressed by so many other digital assistants projected as young women -- provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education," reads the report. It's titled "I'd blush if I could," which used to be Siri's response to being called a bitch. Apple in April updated Siri's response to "I don't know how to respond to that," according to the report. 

The trouble, the UN report says, is that women are often left out of the technology field. Though girls may start their lives interested in tech, their numbers drop significantly by the time they reach high school. That leaves fewer women in the room to help develop products like digital voice assistants and push back on potentially sexist decisions, which in turn can widen the gender gap in tech, according to the report, which also discusses ways to close the gender divide in digital skills.

Amazon and Apple didn't respond to requests for comment. Microsoft didn't have a comment.

Originally published May 22, 10:27 a.m. PT.
Update, 12:18 p.m.: Adds info on Google's voice assistant; includes "no comment" from Microsoft.
