The use of automatic facial recognition (AFR) technology by South Wales Police has been ruled unlawful by the Court of Appeal, in a case that has ramifications beyond law enforcement, with one lawyer describing it as a ‘cautionary tale’ for all organisations.
It followed a legal challenge brought by civil rights group Liberty and Ed Bridges, 37, from Cardiff, who had claimed that being identified by AFR while Christmas shopping and at an arms trade protest in 2018 had caused him distress.
South Wales Police said it would not appeal against the findings. The ruling overturned last year’s High Court decision, which had dismissed Bridges’ case and held that the force’s use of the technology was lawful.
On two key counts the court found that Bridges’ right to privacy under article 8 of the European convention on human rights had been breached, and on another it found that the force had not looked into whether the software was race or gender biased.
The court also found, however, that its use was a proportionate interference with human rights, because the benefits outweighed the impact on Bridges.
South Wales Police has been using the technology since 2017 to pinpoint the movement of suspects by matching faces at events. Potential and false matches were being kept on the system for weeks, the court heard.
Liberty alleged there was no clear guidance on where AFR Locate could be used and who could be put on a watchlist. A data protection impact assessment was found to be deficient and no reasonable steps had been taken by the police to ascertain whether the software had a racial or gender bias.
Responding to last week’s ruling, South Wales Police chief constable Matt Jukes said the testing of the principles behind the technology’s use was an important “step in its development. I am confident this is a judgment that we can work with”.
Bridges said the technology was an “intrusive and discriminatory mass surveillance tool” and its use had breached his human rights once his biometric data was analysed without his knowledge or consent. The police force confirmed that Bridges had never been a person of interest and had never been on a watch list.
He added: “For three years now, South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge.
“We should all be able to use our public spaces without being subjected to oppressive surveillance.”
The technology maps faces in a crowd by measuring the distance between features, then compares results with a “watch list” of images – which can include suspects, missing people and persons of interest.
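The distance-based matching described above can be sketched in a few lines of Python. This is an illustrative simplification, not the force’s actual system: the feature vectors, watchlist names and threshold are all hypothetical stand-ins for real biometric measurements.

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

# Hypothetical feature vectors: in a real system these would be derived
# from measurements between facial landmarks in a captured image.
WATCHLIST = {
    "suspect_A": (0.42, 0.31, 0.77, 0.58),
    "missing_B": (0.12, 0.90, 0.33, 0.64),
}

MATCH_THRESHOLD = 0.1  # illustrative value, not taken from the case


def match_face(features, watchlist=WATCHLIST, threshold=MATCH_THRESHOLD):
    """Return the closest watchlist entry, or None if nothing is near enough."""
    best_name, best_distance = None, float("inf")
    for name, reference in watchlist.items():
        d = dist(features, reference)
        if d < best_distance:
            best_name, best_distance = name, d
    # Below the threshold counts as a potential match; otherwise no match.
    return best_name if best_distance <= threshold else None


print(match_face((0.41, 0.30, 0.78, 0.57)))  # near suspect_A: a potential match
print(match_face((0.99, 0.01, 0.50, 0.50)))  # far from every entry: None
```

The threshold is the crux in practice: set it too loosely and the system generates false matches of the kind the court heard were being retained for weeks; set it too tightly and genuine matches are missed.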
It had been trialled by South Wales Police since 2017, predominantly at big sporting fixtures, concerts and other large events across the force area.
David Lorimer, director of HR and technology at Fieldfisher, said the case was not the setback for AI that some had claimed. “This is not necessarily a repudiation of the use of tech to help make things more efficient, nor is it the death knell for artificial intelligence as plenty of people who have done little research into AI, RPA and automation will no doubt claim,” he said.
“What it is, is a cautionary tale of what to do and what not to do when adopting this kind of tech, to avoid potential discriminatory impacts and to protect subjects’ rights. There is a path through, organisations just need to think carefully about what that looks like and how they can act in a legally compliant and ethically sound way.”
According to the BBC, the force remained “completely committed to its careful development and deployment” and was “proud of the fact there has never been an unlawful arrest as a result of using the technology in south Wales”. It said its use had resulted in 61 people being arrested for offences including robbery and violence, theft and court warrants.
During the remote hearing last month, Liberty’s barrister Dan Squires QC argued that if everyone was stopped and asked for their personal data on the way into a stadium, people would feel uncomfortable.
“If they were to do this with fingerprints, it would be unlawful, but by doing this with AFR there are no legal constraints,” he said, pointing out that clear laws and guidance govern the taking of fingerprints.
Squires said it was the potential use of the power, not its actual use to date, that was the issue, and that there was a lack of safeguards to ensure the use of such technology remained proportionate.
Liberty lawyer Megan Goulding went further: “It is time for the government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned.”
Lorimer added: “Many employers will already use, or may plan to roll out, solutions which use AFR, RPA and AI. Therefore the judgment against South Wales Police – which found that it had failed to comply with its obligations in respect of privacy, data protection and anti-discrimination law – is relevant and applicable to an HR context.
“For instance, many employers rely on recruitment tech which records candidates’ answers to particular questions and automatically scans the recording for keywords, tone and facial expressions, using them to mark the candidates. It has been reported that, in many cases, such platforms are less reliable at identifying expressions for black, Asian and minority ethnic candidates and female candidates – which will rightly be of significant concern to diligent employers.
“As the same potential biases were discussed at length by the Court of Appeal, and formed the basis for its findings, those platforms and solutions will be under the microscope.”