Border officials don't have data to address racial bias in facial recognition tech

CBP doesn't collect data that would determine if facial recognition mistakes are adversely affecting one racial group over another, an official explains.

Alfred Ng, Senior Reporter / CNET News
A woman boarding an SAS flight to Copenhagen goes through facial recognition verification system VeriScan at Dulles International Airport in Virginia.

Jim Watson / AFP/Getty Images

Facial recognition technology is prone to errors, and when it comes to racial bias at airports, there's a good chance it's not learning from its mistakes.

Debra Danisek, a privacy officer with US Customs and Border Protection, talked to an audience Friday at the International Association of Privacy Professionals Summit about what data its facial recognition tech collects -- but more importantly, what data it doesn't collect.

"In terms of 'Does this technology have a different impact on different racial groups?' we don't collect that sort of data," Danisek said. "In terms of keeping metrics on which groups are more affected, we wouldn't have those metrics to begin with."

In other words, while the CBP does collect data that's available on people's passports -- age, gender and citizenship -- to help improve its facial recognition algorithm, it doesn't gather data on race and ethnicity, even when a passenger is misidentified.

So the CBP doesn't know when there's a mismatch based on a person's skin color. Instead, it relies on complaints filed through the Department of Homeland Security's Redress program to identify when that happens.
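
To make concrete what such metrics would look like, here's a minimal sketch of computing a per-group false non-match rate. It is purely illustrative: the group labels, data layout and numbers are assumptions, not anything CBP has published, since the agency doesn't record this data in the first place.

    from collections import defaultdict

    def false_nonmatch_rate_by_group(records):
        """Share of travelers in each group whose live photo failed to
        match their passport photo. `records` is an iterable of
        (group_label, matched) pairs -- labels that, per the article,
        CBP does not collect."""
        totals = defaultdict(int)
        misses = defaultdict(int)
        for group, matched in records:
            totals[group] += 1
            if not matched:
                misses[group] += 1
        return {g: misses[g] / totals[g] for g in totals}

    # Made-up numbers: a disparity like this stays invisible unless
    # demographic labels are recorded alongside match outcomes.
    records = ([("group_a", True)] * 980 + [("group_a", False)] * 20
               + [("group_b", True)] * 930 + [("group_b", False)] * 70)
    print(false_nonmatch_rate_by_group(records))
    # {'group_a': 0.02, 'group_b': 0.07}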

"If they notice we have a pattern of folks making complaints this process, then we would investigate," Danisek said.

Gender and race pose a challenge for facial recognition. Studies have shown the technology has a harder time identifying women and people with darker skin. Civil rights advocates warn that the shortcomings could adversely affect minorities.

Several airports and airlines have rolled out the biometric tech across the US, offering travelers a faster way to board flights. The technology scans a traveler's face and matches it against a passport photo provided to the airlines by the State Department. It'll be used in the top 20 US airports by 2021. CBP says its match rates are in the high 90s, while a study from the DHS Office of Inspector General found a match rate closer to 85%.
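
At its core, that kind of one-to-one verification comes down to comparing a similarity score between two face images against a tuned threshold. The sketch below is a generic illustration under assumed details; the cosine-similarity measure, the 0.6 cutoff and the embedding inputs are all assumptions, not published CBP specifics.

    import numpy as np

    THRESHOLD = 0.6  # assumed cutoff; deployments tune this to trade
                     # false matches against false non-matches

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify(live: np.ndarray, on_file: np.ndarray) -> bool:
        """One-to-one check: does the boarding-gate photo match the
        passport photo on file?"""
        return cosine_similarity(live, on_file) >= THRESHOLD

A reported "match rate" is then just the share of travelers for whom this check succeeds against their own file photo, which is why that number alone says nothing about how errors are distributed across groups.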

Customs and Border Protection says the system is getting better. A spokesman for the agency noted that the OIG study drew from a 2017 demo that evaluated the potential of the Traveler Verification Service.

"In the current deployment of TVS," the spokesman said, "CBP has been able to successfully photograph and match over 98% of travelers who have photos in U.S. Government systems."

In addition, CBP is working with the National Institute of Standards and Technology to analyze the performance of face-matching tech, "including impacts due to traveler demographics and image quality," the spokesman said.

A lack of diverse training data is what led to racial bias in facial recognition in the first place. Experts have suggested that the photo databases used to build facial recognition systems may contain more images of white people than people of color, which skews how well the technology works for minorities.

Jake Laperruque, a senior counsel at the Constitution Project, is concerned that the agency is turning a blind eye to the potential for racial bias at airports.

"The comments reflect a troubling lack of concern about well-documented problem of facial recognition systems having higher error rates for people of color," Laperruque said in an email. "CBP can't simply ignore a serious issue and take a 'see no evil approach' -- if they're not willing to confront serious civil rights problems and deal with them, they shouldn't be trusted to operate a program like this." 

Originally published May 6.
Updated May 8: Added comment from a CBP spokesman.