
Amazon's Rekognition software lets cops track faces: Here's what you need to know

Amazon wants to provide facial recognition software to law enforcement and businesses. We have questions.

Ry Crist, Senior Editor

Amazon Rekognition is the company's effort to create software that can identify anything it's looking at -- most notably faces. 

Business organizations and, yes, law enforcement agencies are already licensing that software for their own use. That means that you don't need to use Facebook or buy a face-scanning iPhone or a fancy video doorbell from Google-owned Nest or Amazon-owned Ring in order for facial recognition to be a part of your everyday life. With Rekognition, maybe it already is.


This is part of a CNET special report exploring the benefits and pitfalls of facial recognition.

And maybe you aren't OK with that. Civil liberties groups such as the ACLU have already raised concerns about the speedy adoption of facial recognition tech among US law enforcement agencies and the potential for its abuse, particularly against immigrants and people of color. Many -- including some of Amazon's own employees and shareholders -- want the company to hit the brakes.

The controversy caught the attention of Congress last year, and now, with the Senate recently proposing a bill that would limit businesses from collecting and tracking facial recognition data without consent, it seems that Rekognition might be in for a reckoning.

All of which is to say that it's a good time to dive in and get a better understanding of what Rekognition is, how it works and what it's being used for.

Read more: Facial recognition 101: Your face is your fingerprint

What exactly does Amazon Rekognition do?

Glad you asked. Let's start by looking at what Amazon says:

"Amazon Rekognition makes it easy to add image and video analysis to your applications. You just provide an image or video to the Rekognition API, and the service can identify objects, people, text, scenes and activities. It can detect any inappropriate content as well. 

"Amazon Rekognition also provides highly accurate facial analysis and facial recognition. You can detect, analyze and compare faces for a wide variety of use cases, including user verification, cataloging, people counting and public safety."

Like a lot of what Amazon is up to these days, Rekognition centers on artificial intelligence and machine learning. If Alexa is Amazon's effort to give AI ears and a voice, then Rekognition could be seen as the company's effort to give AI a sense of sight and the intelligence to recognize what it's looking at. The difference is that Alexa is built for consumers like you and me, while Rekognition is an enterprise offering intended for businesses and organizations.

All of that sounds simple enough, right? It's image- and face-detecting software that developers can license from Amazon for their own applications. But start thinking about the ways that businesses and organizations might be putting Rekognition to use -- and some of the ways that they might in the future -- and things get more complicated.


How does Amazon Rekognition work?

Amazon says that its Rekognition software is based on deep learning technology developed by computer vision scientists. It's actually two separate software tools, or API sets: Amazon Rekognition Image, which analyzes images, and Amazon Rekognition Video, which analyzes video.

Like other image recognition applications, Rekognition looks for common structural identifiers called "landmarks" in whatever it's looking at. With an apple, that might be the shape and color of the fruit, along with characteristics like the stem. With a face, it's the shape of the features and the distance between them.
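For a rough illustration of those facial landmarks, Rekognition's DetectFaces operation returns the positions of features such as the eyes, nose and mouth for every face it finds. A hedged sketch, again using boto3 with placeholder bucket and file names:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Detect faces and print a few of the landmarks Rekognition reports for each one.
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photos/portrait.jpg"}},
        Attributes=["ALL"],  # include landmarks, emotions, estimated age range and more
    )

    for face in response["FaceDetails"]:
        for landmark in face["Landmarks"]:
            if landmark["Type"] in ("eyeLeft", "eyeRight", "nose"):
                # X and Y are expressed as fractions of the image's width and height.
                print(landmark["Type"], round(landmark["X"], 3), round(landmark["Y"], 3))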

Once it has scanned an image, the software assigns a confidence score to whatever it thinks it recognizes. That score is measured against a threshold the customer sets for declaring a match -- one Rekognition user might decide that anything above 75 percent confidence is good enough to call a positive match, while another with a higher-stakes application might set the bar at 99. The higher the threshold, the more certain the software has to be before it declares a match.
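In code, that threshold is just a parameter. With Rekognition's CompareFaces operation, for example, the caller sets a similarity threshold and only matches at or above it come back. A minimal sketch, with placeholder image names:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Compare a known face against a second image, keeping only strong matches.
    # A lower-stakes application might drop the threshold to 75 or 80.
    response = rekognition.compare_faces(
        SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "ids/badge-photo.jpg"}},
        TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "cameras/lobby-frame.jpg"}},
        SimilarityThreshold=99,
    )

    for match in response["FaceMatches"]:
        print(f"Match at {match['Similarity']:.1f} percent similarity")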

A lot of that confidence depends on the quality and angle of the image in question, but image-recognition software like this has come a long way in recent years. That's thanks in no small part to intense research interest from the titans of tech -- not just Amazon, but Apple, Facebook, Google, Microsoft, Samsung and others.

"Rekognition is always learning from new data, and we're continually adding new labels and facial recognition features to the service," Amazon says.

With facial recognition, it's important to note that Amazon doesn't keep its own database of faces to match against. Instead, it's up to the user to provide a "face collection" that they own and manage. For a photo storage service, that face collection could be the photos that users upload. For a law enforcement agency, the face collection could be an existing database of mugshots.
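In practice, that face collection is something the customer builds with Rekognition's own tools: create a collection, index known faces into it, then search it with new images. A rough sketch of that flow, with hypothetical names throughout:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # 1. Create a collection that the customer owns and manages.
    rekognition.create_collection(CollectionId="example-collection")

    # 2. Index a known face into the collection (say, one photo per person).
    rekognition.index_faces(
        CollectionId="example-collection",
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "known/person-001.jpg"}},
        ExternalImageId="person-001",
    )

    # 3. Search the collection with a new image to see whom it resembles.
    result = rekognition.search_faces_by_image(
        CollectionId="example-collection",
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "unknown/query.jpg"}},
        FaceMatchThreshold=99,
        MaxFaces=5,
    )

    for match in result["FaceMatches"]:
        print(match["Face"]["ExternalImageId"], match["Similarity"])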

What does Amazon Rekognition cost?

Companies don't pay an upfront cost to use Rekognition. They pay as they go based on how much they use it.

"With Amazon Rekognition, you pay for the images and videos you analyze and the face metadata that you store," the company explains, adding that customers can analyze 5,000 images and 1,000 minutes of video per month for free during their first year using the service.

Rates after that vary based on region, but in the US, Rekognition customers pay 10 cents for each minute of video analyzed and $1 for every 1,000 images processed. Customers also pay to store the metadata from images and videos they analyze within Amazon's servers. Discounted bulk rates apply for customers who process more than 1 million images.
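Using the US rates above, a quick back-of-the-envelope estimate might look like the sketch below. The volumes are hypothetical, and the free tier, metadata storage fees and bulk discounts are ignored:

    # A rough monthly estimate at the US rates quoted above.
    images_processed = 50_000   # hypothetical monthly volume
    video_minutes = 2_000       # hypothetical monthly volume

    image_cost = images_processed / 1_000 * 1.00  # $1 per 1,000 images
    video_cost = video_minutes * 0.10             # 10 cents per minute of video

    print(f"Estimated monthly bill: ${image_cost + video_cost:,.2f}")  # $250.00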

What's Amazon Rekognition being used for?

A sample of Rekognition's "People Pathing" code, used to track an individual's movement within a video feed. (Image: Amazon)

That really depends on who's using it (more on that in just a bit), but Amazon lists example use cases including:

  • Making image and video libraries searchable
  • Face-based user verification
  • Detecting unsafe or inappropriate content
  • Recognizing celebrities
  • Detecting text in images

Other uses include analyzing the demographic makeup or even the emotional state of whoever the software is looking at, as well as something Amazon calls "People Pathing." Like it sounds, People Pathing uses Rekognition to track specific people as they move within the frame of a video feed. According to Amazon, it's capable of tracking:

  • The location of the person in the video frame at the time their path is tracked
  • Facial landmarks such as the position of the left eye, when detected
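People Pathing is part of Rekognition Video, so it runs as an asynchronous job against a video stored in S3. A hedged sketch of what retrieving that tracking data looks like, with placeholder bucket and file names:

    import time

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Start an asynchronous person-tracking job on a video stored in S3.
    job = rekognition.start_person_tracking(
        Video={"S3Object": {"Bucket": "example-bucket", "Name": "videos/lobby.mp4"}}
    )

    # Poll until the job finishes (a production app would listen for an SNS notification).
    while True:
        tracking = rekognition.get_person_tracking(JobId=job["JobId"], SortBy="TIMESTAMP")
        if tracking["JobStatus"] != "IN_PROGRESS":
            break
        time.sleep(5)

    # Each record reports a person's bounding box (and face details, when visible)
    # at a timestamp in the video, measured in milliseconds.
    for record in tracking["Persons"]:
        print(record["Timestamp"], record["Person"]["Index"], record["Person"]["BoundingBox"])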

Amazon did not respond when we asked whether anything prevents specific businesses, organizations or law enforcement agencies from using any of Rekognition's APIs for any of its use cases. The fair assumption is that all of Rekognition's tools are on the table for all customers acting in accordance with the Amazon Web Services (AWS) acceptable use policy.

Who's using it?

That's a good question, and one that gets to the heart of what makes Rekognition controversial.

Though the company has highlighted Rekognition's use by nonprofits such as Thorn and the International Centre for Missing & Exploited Children to find potential leads on missing children and human trafficking victims, Amazon doesn't disclose its Rekognition customers without their consent -- and it wouldn't tell me the number of total Rekognition customers, either. According to documents obtained by the American Civil Liberties Union in May 2018, the list includes multiple US law enforcement agencies. (We have more reporting on some of the ways law enforcement agencies in places like Oregon and Florida are already using Rekognition, which you can read about here).

Police around the world have been using facial recognition technology for years now, but the disclosure was still enough to raise questions about Rekognition's capabilities, about how it might be used and about who exactly was using it. Before long, the ACLU was calling for Amazon to stop selling its Rekognition software to governments and law enforcement agencies altogether.

"The rights of immigrants, communities of color, protesters and others will be put at risk if Amazon provides this powerful surveillance system to government agencies," said Shankar Narayan, the technology and liberty director of ACLU of Washington.  

Some of those concerns have even come from within Amazon itself. In June of last year, a group of Amazon employees released a letter to Amazon Founder and CEO Jeff Bezos calling on the company to implement strong transparency and accountability measures and to stop selling Rekognition services to law enforcement agencies. 

"We already know that in the midst of historic militarization of police, renewed targeting of Black activists and the growth of a federal deportation force currently engaged in human rights abuses -- this will be another powerful tool for the surveillance state, and ultimately serve to harm the most marginalized," the letter reads.

Amazon declined to comment in response at the time and didn't respond to a request for comment ahead of this piece, either.

In July of 2018, the ACLU claimed that Amazon Rekognition mismatched 28 members of Congress with mugshots of criminals. (Image: ACLU)

Where do things stand now?

The controversy hasn't subsided. The ACLU continues to press its argument that Amazon shouldn't be selling Rekognition to government law enforcement agencies, even releasing a report showing that the software misidentified 28 members of Congress as criminals when the confidence level was set to 80 percent. The false matches disproportionately affected people of color, the report notes.

Amazon disputed that report, issuing the following testy rebuke via blog post:

"When Rekognition is used as recommended for public safety (with 99 percent confidence levels), the same reports that the ACLU claimed contained 5 percent error rates yielded 0 percent error rates. This is inconvenient for the ACLU's rhetoric, but these are also the facts."

Even so, in November 2018, eight Democratic members of Congress expressed their concerns with Rekognition in a letter to Amazon:

"Facial recognition technology may one day serve as a useful tool for law enforcement officials working to protect the American public and keep us safe," they wrote, "However, at this time, we have serious concerns that this type of product has significant accuracy issues, places disproportionate burdens on communities of color and could stifle Americans' willingness to exercise their First Amendment rights in public."

The issue isn't going away. In January of this year, a group of the company's own shareholders urged Amazon to stop selling Rekognition software to law enforcement agencies. More recently, in March, a group of prominent AI researchers, including experts from Microsoft, Google and Facebook, as well as the 2018 winner of the prestigious Turing Award, penned an open letter warning of inherent biases built into Rekognition and calling on Amazon to stop selling it to the police. 

Perhaps complicating things is Amazon's ownership of Ring, the maker of a popular line of smart video doorbells. Recent patents from the company lay out a vision for face-detecting video doorbells that keep an eye out for convicted felons, sex offenders and the like, then relay the information directly to police. Amazon has already begun to let owners of its popular line of Echo smart speakers use those speakers as makeshift security devices that listen for trouble whenever they're away from home.

"Amazon is dreaming of a dangerous future," the ACLU's Jacob Snow said in a statement, "with its technology at the center of a massive decentralized surveillance network, running real-time facial recognition on members of the public using cameras installed in people's doorbells."

"We are always innovating on behalf of neighbors to make our neighborhoods better places to live, and this patent is one of many ideas to enhance the services we offer," a Ring spokesperson said in a statement. "However, patents do not necessarily reflect current developments to products and services, and this patent certainly does not imply implementation. Privacy is of the utmost importance to us, and we always design our services to include strong privacy protections."  

What does Amazon say about all of this?

Amazon didn't offer any comments about Rekognition ahead of this story's publication.

This February, amid the criticism, Amazon wrote at length on the topic in a company blog post. Along with defending Rekognition, Amazon seemed to join companies such as Microsoft that have already called for greater oversight and transparency with regard to the ways facial recognition tech is being put to use.

"In the two-plus years we've been offering Amazon Rekognition, we have not received a single report of misuse by law enforcement," wrote Amazon Web Services' vice president of global public policy, Michael Punke. "Even with this strong track record to date, we understand why people want there to be oversight and guidelines put in place to make sure facial recognition technology cannot be used to discriminate. We support the calls for an appropriate national legislative framework that protects individual civil rights and ensures that governments are transparent in their use of facial recognition technology."

The blog post goes on to lay out some suggested standards for facial recognition, which I'll list here:

  • Facial recognition should always be used in accordance with the law, including laws that protect civil rights
  • When facial recognition technology is used in law enforcement, human review is a necessary component to ensure that the use of a prediction to make a decision does not violate civil rights
  • When facial recognition technology is used by law enforcement for identification, or in a way that could threaten civil liberties, a 99 percent confidence score threshold is recommended
  • Law enforcement agencies should be transparent in how they use facial recognition technology
  • There should be notice when video surveillance and facial recognition technology are used together in public or commercial settings

"New technology should not be banned or condemned because of its potential misuse," the blog concludes. "Instead, there should be open, honest and earnest dialogue among all parties involved to ensure that the technology is applied appropriately and is continuously enhanced."

"We will continue to work with partners across industry, government, academia and community groups on this topic because we strongly believe that facial recognition is an important, even critical, tool for business, government and law enforcement use."

Originally published March 19 and updated as new developments occur.