Amazon's facial tech shows gender, racial bias, MIT study says

The Rekognition software struggled to identify the gender of women, particularly women with darker skin, according to the study.

Abrar Al-Heeti, Technology Reporter

Amazon's facial technology had a harder time recognizing the gender of darker-skinned women and made more mistakes identifying gender overall than competing technologies from Microsoft and IBM, according to an MIT study published Thursday.

Amazon's Rekognition software incorrectly identified women as men 19 percent of the time, according to the study, and misidentified darker-skinned women as men 31 percent of the time. Software from Microsoft, by comparison, identified darker-skinned women as men 1.5 percent of the time.

Matt Wood, general manager of artificial intelligence at Amazon Web Services, said the study's test results are based on facial analysis, not facial recognition. Analysis, he said, can find faces in videos or images and assign generic attributes, such as whether a person is wearing glasses. Recognition, he said, matches an individual's face to images in videos and photographs. The Rekognition technology includes both functions.

"It's not possible to draw a conclusion on the accuracy of facial recognition for any use case – including law enforcement – based on results obtained using facial analysis," Wood said in a statement. 

Wood added that the study didn't use the latest version of Rekognition. Using an up-to-date version of Rekognition with similar data, Amazon found no false positive matches, Wood said.

Deborah Raji, an author of the study, said she and co-author Joy Buolamwini understand the distinction between facial recognition and facial analysis. 

"We make it clear in our paper that the task we chose to evaluate is the facial analysis task of binary gender classification," Raji said. "That means, given the number of faces detected, how well does the model understand what it sees?"

In a Friday blog post, Buolamwini cautioned people to be skeptical when companies say they have completely accurate systems.

"Wood states the company used a large benchmark of over 1 million faces to test their facial recognition capabilities and performed well," Buolamwini wrote. "While their performance on the benchmark might seem laudable, we do not know the detailed demographic or phenotypic (skin type) composition of this benchmark. Without this information we cannot asses for racial, gender, color, or other kinds of bias."

Amazon has provided Rekognition to law enforcement agencies, though civil liberties groups, members of Congress and Amazon's own employees have raised concerns about privacy. Earlier this month, a group of shareholders also called on Amazon to stop selling its Rekognition technology to government agencies.

In light of the MIT study, Buolamwini said it's "irresponsible" for Amazon to keep selling the technology to law enforcement agencies. Facial analysis technology can be abused and could lead to mass surveillance, she said. In addition, inaccuracies could result in innocent people being misidentified as criminals.

Raji echoed that sentiment. "If the system falsely identifies a suspect due to its reduced accuracy on a particular demographic," she said, "that could be seriously harmful."

First published Jan. 25, 3:54 p.m. PT.
Update, 11:14 p.m.: Adds comment from Buolamwini and Raji.