
Amazon's Rekognition falsely matched lawmakers to criminals... again, ACLU says

Amazon says the ACLU is misrepresenting its facial recognition software.

Shelby Brown

The ACLU and Amazon butted heads Tuesday over facial recognition technology. 

James Martin/CNET

The American Civil Liberties Union of Northern California says facial recognition technology isn't ready to be used for law enforcement. The statement comes after a test in which Amazon's Rekognition software wrongly flagged 26 California lawmakers as criminals, according to the ACLU.

The ACLU tweeted about the test on Tuesday, saying the software matched 1 in 5 lawmakers to the mugshot of someone who'd been arrested. The test compared legislators' images with a database of 25,000 publicly available mugshots. The ACLU of Northern California also released an image of all the lawmakers it says the Rekognition system erroneously flagged as criminals.

The ACLU image includes the text: "Yes on AB 1215. One false match is too many." AB 1215 is a state bill that aims to ban facial recognition software from being used on police body cameras. It was authored by Democratic Assemblyman Phil Ting, who was among the lawmakers wrongly matched by Amazon's system in the ACLU test.

"Imagine the real world implications," Ting tweeted on Monday.


Amazon, however, says the ACLU is knowingly misrepresenting its technology.

"As we've said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking," an Amazon spokesperson said in an emailed statement.

A confidence score, according to Amazon, is a number between 0 and 100 that indicates the probability that a given prediction is correct.
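In practice, Rekognition is called through the AWS SDK, where that threshold shows up as an explicit parameter. The snippet below is a minimal sketch, not Amazon's or the ACLU's actual pipeline: it assumes the Python boto3 client, placeholder image files and region, and passes the 99% similarity threshold Amazon's statement refers to, so only matches scored at 99 or above come back.

```python
import boto3

# Minimal sketch of a Rekognition face comparison via boto3.
# Image file names and region are placeholders; the 99% threshold mirrors
# the confidence level Amazon says it recommends for law enforcement use.
client = boto3.client("rekognition", region_name="us-west-2")

with open("lawmaker.jpg", "rb") as source, open("mugshot.jpg", "rb") as target:
    response = client.compare_faces(
        SourceImage={"Bytes": source.read()},
        TargetImage={"Bytes": target.read()},
        SimilarityThreshold=99,  # matches scored below 99 are dropped
    )

for match in response["FaceMatches"]:
    print(f"Possible match, similarity: {match['Similarity']:.1f}%")
```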

The ACLU says Amazon is knowingly misleading the public about its facial recognition software.

"Amazon knows -- with 100% certainty -- that its law enforcement customers are using lower confidence scores or no score at all when using the company's system," Matt Cagle, technology and civil liberties attorney at the ACLU of Northern California, said in an email Wednesday. 

The Washington County Sheriff's Office in Oregon is the only law enforcement entity currently listed as an Amazon Rekognition customer, according to the company's website. The office didn't immediately respond to a request for comment.

Last year, the ACLU accused Amazon of the same privacy faux pas after comparing 25,000 criminal mugshots with images of members of Congress. The tool matched 28 members of Congress to people who'd been arrested. Afterwards, it came to light that the ACLU got its mugshot matches by running the Rekognition software at its default 80% confidence threshold. In other words, the software may have returned matches it was only 80% confident were correct. Amazon recommended a confidence level of at least 95% for law enforcement agencies.
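To make that difference concrete, the toy snippet below uses hypothetical similarity scores (not ACLU or Amazon data) to show how the same pool of candidate matches shrinks as the threshold moves from the 80% default toward the level Amazon recommends.

```python
# Hypothetical similarity scores for illustration only -- not real test data.
candidates = [("Candidate A", 81.2), ("Candidate B", 88.5),
              ("Candidate C", 96.0), ("Candidate D", 99.3)]

def matches_at(threshold, scores):
    """Return the candidates whose similarity meets or exceeds the threshold."""
    return [name for name, similarity in scores if similarity >= threshold]

print(matches_at(80, candidates))  # all four candidates pass at the default
print(matches_at(99, candidates))  # only 'Candidate D' passes at 99%
```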

Watch this: Backlash grows for police use of facial recognition (The 3:59, Ep. 562)