
ACLU decries face-recognition tools

Face-recognition technology designed to help catch known criminals proves to be ineffective during a two-month period, according to a report from the civil liberties organization.

Stefanie Olsen, staff writer, CNET News

Face-recognition technology designed to help catch known criminals proved ineffective during a two-month period, according to a report released Thursday by the American Civil Liberties Union.

Using state open-records laws, the civil liberties organization examined system logs at a Florida police department. It discovered that over a two-month period, the software never correctly matched a single person captured on camera to a photograph in the department's criminal database, according to the report. The software did, however, produce many false identifications.

The ACLU said it also found that the police department, which installed the software in the Ybor City neighborhood of Tampa, Fla., stopped using the technology in August. Use of the software, from New Jersey-based Visionics, was halted just two months into a 12-month trial.

Representatives from the Tampa Police Department were not immediately available for comment.

However, according to the report, Tampa police officials said they stopped using the system because of "disruptions in police redistricting." Police officials planned to resume its operation in the future, the report said.

Visionics, one of the biggest makers of face-recognition systems, disputed statements that the police department stopped using the technology and that it is ineffective.

"Maybe there was no criminal present in Ybor City at the time the system was turned on," said Frances Zelazny, a Visionics spokeswoman.

"This is an investigative tool for law enforcement to use--just like a metal detector at an airport. The system does not go and put handcuffs on anyone. A human operator has to verify the match," she said.

Face-recognition software has become more widespread following the Sept. 11 attacks in New York and Washington, D.C., in which terrorists eluded airport security to hijack four commercial airplanes. As a result, airports including Boston's Logan International Airport and Oakland International Airport, in California, adopted the surveillance technology as a security precaution.

Visionics signed a deal late last year with conglomerate Tyco International to distribute its technology at some 100 of the nation's 450 commercial airports. Even the U.S. Army recently licensed face-recognition technology from rival Viisage Technology to create custom high-security applications.

Such face-recognition software uses biometrics, or the digital analysis of biological characteristics such as facial structure, fingerprints and iris patterns captured through cameras or scanners. The software then matches those profiles against databases of people such as suspected terrorists.
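In rough terms, systems of this kind reduce each face to a numeric template and compare a camera capture against every template on a watchlist, flagging any comparison whose similarity score clears a threshold for a human operator to verify. The sketch below is a minimal, hypothetical illustration of that one-to-many screening step; the templates, names and threshold are invented for the example, and commercial products such as Visionics' use proprietary, far more complex algorithms.

```python
import math

# Hypothetical watchlist: each face has already been reduced to a short
# numeric template (real systems use proprietary, much larger vectors).
watchlist = {
    "subject_a": [0.91, 0.10, 0.40],
    "subject_b": [0.05, 0.88, 0.47],
    "subject_c": [0.33, 0.33, 0.89],
}

def cosine_similarity(a, b):
    """Score two templates from -1.0 (opposite) to 1.0 (identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def screen(probe, threshold=0.95):
    """Compare one camera capture against every watchlist template.

    Returns candidate matches above the threshold; a human operator
    would then confirm or reject each one, as Visionics describes.
    A threshold set too low yields false matches of the kind the ACLU
    reported; set too high, it misses genuine matches.
    """
    return [
        (name, score)
        for name, template in watchlist.items()
        if (score := cosine_similarity(probe, template)) >= threshold
    ]

# An invented template standing in for a street-camera capture.
capture = [0.89, 0.12, 0.43]
for name, score in screen(capture):
    print(f"Candidate match: {name} (similarity {score:.3f}) -- refer to operator")
```

The threshold embodies the trade-off at the heart of the Tampa dispute: loosening it produces more false alarms for operators to sift through, while tightening it risks letting genuine matches walk by.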

But civil rights advocates have cautioned that out-of-date photos and poor lighting could result in numerous misidentifications. They have also warned that the technology's widespread adoption could come at the cost of civilian privacy. Instead of protecting public safety and apprehending terrorists, the technology may become a tool for spying on citizens as they move about in public places, they say.

"Face recognition is all hype and no action," Barry Steinhardt, associate director of the ACLU and an author of the report, said in a statement. "Potentially powerful surveillance systems like face recognition need to be examined closely before they are deployed, and the first question to ask is whether the system will actually improve our safety. The experience of the Tampa Police Department confirms that this technology doesn't deliver."

According to the ACLU report, the Florida system logs showed many false matches between people photographed by police video cameras and images in the department's database of criminals, sex offenders and runaways. It said the software matched male subjects with female ones, and paired people with significant differences in age or weight.

Visionics disputed those findings.

"They're talking about misidentifications, implying that people were falsely arrested, and that's not the case. There are measures in place that if there's a false alarm, there's someone there to verify the match," Zelazny said. "That the system is blind to race is a privacy enhancing measure."

Security and privacy expert Richard Smith said the report strikes a different note from the accolades coming from the companies selling the technology.

"This is not exactly a ringing endorsement of the technology. It's important to look at real-world installations and not just listen to hype from the technology companies," he said.

"Is this the best way to spend the security money at airports? Tampa shows no."

Other government agencies, including the Immigration and Naturalization Service (INS), stopped using the technology after tests. In past years, the INS tested the software to identify people in cars at the Mexico-U.S. border.