A civil rights campaigner has expressed his delight after the Court of Appeal ruled police use of facial recognition technology interfered with privacy rights and data protection laws.
Ed Bridges brought a legal challenge against South Wales Police, arguing their use of automatic facial recognition (AFR) had caused him “distress”.
The 37-year-old had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.
In a judgment handed down on Tuesday, three Court of Appeal judges ruled the use of the technology was unlawful and allowed Mr Bridges’ appeal on three of the five grounds raised in his case.
The ruling does not prevent the force from using the technology but means it will have to make changes to the systems and policies it uses for AFR.
In a statement, Mr Bridges said he was “delighted” the court had found that “facial recognition clearly threatens our rights”.
He said: “This technology is an intrusive and discriminatory mass surveillance tool.
“For three years now, South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”
South Wales Police said the test of their “ground-breaking use of this technology” by the courts had been a “welcome and important step in its development” and the force will give the findings “serious attention”.
Chief Constable Matt Jukes said: “The Court of Appeal’s judgment helpfully points to a limited number of policy areas that require this attention.
“Our policies have already evolved since the trials in 2017 and 2018 were considered by the courts, and we are now in discussions with the Home Office and Surveillance Camera Commissioner about the further adjustments we should make and any other interventions that are required.”
Mr Jukes added: “We are pleased that the court has acknowledged that there was no evidence of bias or discrimination in our use of the technology.
“But questions of public confidence, fairness and transparency are vitally important, and the Court of Appeal is clear that further work is needed to ensure that there is no risk of us breaching our duties around equality.”
The force does not intend to appeal against the judgment.
In the judgment, the judges said the High Court erred when it concluded that the force’s interference with Mr Bridges’ right to a private life was “in accordance with the law”, as human rights legislation requires.
They ruled there was no clear guidance on where AFR Locate – the system being used by South Wales Police in an ongoing trial – could be used and who could be put on a watchlist, and this left too much discretion to police officers.
The ruling said: “The fundamental deficiencies, as we see it, in the legal framework currently in place relate to two areas of concern.
“The first is what was called the ‘who question’ at the hearing before us. The second is the ‘where question’.
“In relation to both of those questions, too much discretion is currently left to individual police officers.
“It is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR can be deployed.”
It added the court is satisfied “that the current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police and for that reason do not have the necessary quality of law”.
In their ruling, Master of the Rolls Sir Terence Etherton, President of the Queen’s Bench Division Dame Victoria Sharp and Lord Justice Singh did find the use of AFR was proportionate under human rights law, as its potential benefits outweighed the impact on Mr Bridges.
The court also concluded a data protection impact assessment of the scheme was deficient and the force had not done all it could to verify that the AFR software “does not have an unacceptable bias on grounds of race or sex”.
The judgment noted there was no clear evidence that the software was biased on grounds of race or sex.
The judges said they hoped that, as AFR is “a novel and controversial technology”, all police forces intending to use it in the future “would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias”.
At a three-day Court of Appeal hearing in June, lawyers for Mr Bridges argued that the facial recognition technology interferes with privacy and data protection laws and is potentially discriminatory.
AFR is used by South Wales Police to live-capture the facial biometrics of large numbers of people and compare them with people on a “watchlist” – which can include suspects, missing people and persons of interest.
The facial biometric data of anyone whose image is captured on CCTV but does not generate a match is not retained.
Facial recognition technology is also used by the Metropolitan Police.
Mr Bridges’ case was dismissed at the High Court in September last year by two senior judges, who concluded that the use of the technology was not unlawful.
Mr Bridges, who the force confirmed was not a person of interest and has never been on a watchlist, crowdfunded his legal action and is supported by civil rights organisation Liberty, which is campaigning for a ban on the technology.
Megan Goulding, a lawyer for Liberty, said: “This judgment is a major victory in the fight against discriminatory and oppressive facial recognition.”
In a statement, the Metropolitan Police said: “The MPS approach to live facial recognition is different to the South Wales Police cases which were appealed.
“This reflects our policing needs, which given the complexity of keeping London safe and the different crime issues impacting the capital, are very different from those in South Wales.”
The force added: “We will carefully consider the judgment and act on any relevant points to ensure that we maintain our commitment to use facial recognition in a lawful, ethical and proportionate way.”