Police use of facial recognition gets reined in by UK court

Facial recognition tests in public spaces violated privacy, didn't follow data protection procedures and didn't address possible bias, judges say.

Katie Collins Senior European Correspondent

A close-up of a police facial recognition camera used in Cardiff, Wales.

Matthew Horwood/Getty Images

Since 2017, police in the UK have been testing live, or real-time, facial recognition in public places to try to identify criminals. The legality of these trials has been widely questioned by privacy and human rights campaigners, who just won a landmark case that could have a lasting impact on how police use the technology in the future.

In a ruling Tuesday, the UK Court of Appeal said South Wales Police had been using the technology unlawfully, which amounted to a violation of human rights. 

In a case brought by civil liberties campaigner Ed Bridges and supported by human rights group Liberty, three senior judges ruled that South Wales Police had violated Bridges' right to privacy under the European Convention on Human Rights. They also said that data protection procedures hadn't been properly followed and that not enough had been done to address the potential for the technology to discriminate on the basis of race or sex.

The case brought by Bridges had originally been dismissed by High Court judges last year, but Bridges challenged this in the Court of Appeal, where his appeal was upheld on three of five grounds.

In a statement, Bridges said he was "delighted" by the decision. "This technology is an intrusive and discriminatory mass surveillance tool," he said. "For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance."

Liberty lawyer Megan Goulding added that the judgment was "a major victory in the fight against discriminatory and oppressive facial recognition."

UK Independent Surveillance Camera Commissioner Tony Porter, whose role is to ensure compliance with the surveillance camera code of practice, also welcomed the court's decision.

"If there is to be an ethical and evolutionary process for the legitimate use of automated facial recognition (AFR) technology by the state then it is essential that the public have trust in the technology, its legal and regulatory controls and the honesty of endeavour by the police themselves," he said in a statement. The court's decision and findings are a key part of that evolutionary process, he added.

South Wales Police said it won't appeal the ruling.

A death knell for police use of facial recognition?

For campaigners and for the police, the next step will be to determine how to interpret the ruling and establish whether police can continue to use the technology. The tech has so far been used by two police forces: South Wales Police and London's Metropolitan Police. 

Some privacy campaigners hope the decision is one step toward the technology being outlawed completely. 

"This is a huge step forward in the fight against facial recognition and should deter police from lawlessly rolling out other kinds of oppressive technologies they've been looking at," privacy organization Big Brother Watch said in a statement. "Now it's vital we achieve a legislative ban on live facial recognition to end this dangerously authoritarian surveillance for good."

Liberty is also calling for an outright ban on the use of the technology in public, but the court's decision doesn't call for this. The criticisms pointed out in the judgment are specifically about the unsuitable regulatory framework, failure to adhere to data protection laws and failure to address discrimination concerns. These are all things that can potentially be resolved to allow police to continue using the technology.

In his statement, Porter said he didn't believe the decision was "fatal" to police use of facial recognition technology in the UK -- instead, it showed the importance of having clear legal and ethical parameters for its use. He called on the UK's Home Office and Home Secretary to update the surveillance camera code and "commission an independent review of the legal framework which governs overt state surveillance."

In its own statement, the South Wales Police said it didn't believe the ruling would mean it had to stop its own use of live facial recognition, which it says has resulted in 61 people being arrested for offenses including robbery and violence, theft and court warrants.

"There is nothing in the Court of Appeal judgment that fundamentally undermines the use of facial recognition to protect the public," said Deputy Chief Constable Jeremy Vaughan. "This judgement will only strengthen the work which is already underway to ensure that the operational policies we have in place can withstand robust legal challenge and public scrutiny."