Rekognition is a facial recognition program Amazon has provided to law enforcement agencies in Florida and reportedly marketed to US Immigration and Customs Enforcement. The program has been criticized as unreliable, often misidentifying the faces it's asked to match. Last July, the ACLU found that Rekognition mismatched 28 members of Congress with known criminals using a database of 25,000 mugshots. Amazon disputed the ACLU's findings.
Facial recognition in general -- not just from Amazon -- has also been controversial because some see it as an invasion of privacy that's also open to technical problems. In China, for instance, a surveillance program mistakenly identified a face on a bus ad as a jaywalker. Despite the outcry, facial recognition is expanding to airports, concerts and grocery stores.
While facial recognition has seeped into our everyday lives, its use by government agencies is perhaps the most problematic. More than 85 human rights groups have joined the ACLU in asking companies like Amazon, Microsoft and Google to stop selling facial recognition to governments. The main concern is that the technology could be used to discriminate and to mistakenly punish innocent people.
Shareholders are the latest to echo that concern. A group of them sent Amazon a signed resolution, organized by technology policy nonprofit Open MIC, that calls for Amazon to refrain from selling Rekognition to government agencies until it can ensure the technology doesn't violate human rights.
Amazon received the letter on Dec. 19, and the shareholder resolution is expected to go to a vote at the company's annual meeting this spring. Amazon declined to comment.
The Sisters of St. Joseph of Brentwood filed the resolution as shareholders and members of the Tri-State Coalition for Responsible Investment.
"As women religious, with institutional investments, we call on companies we hold to respect human rights in all they do. We're especially aware of the risks facing vulnerable populations," Sister Patricia Mahoney said in a statement. "We filed this proposal because we are concerned that Amazon has pitched facial recognition technology to Immigration and Customs Enforcement (ICE) and piloted its Rekognition with police departments, without fully assessing potential human rights impacts."
First published Jan. 17 at 6 a.m. PT.
Update Jan. 18 at 3:50 p.m. PT: Adds that Amazon has disputed the results of the ACLU's Rekognition test.