POLICE have used racially biased facial recognition technology for more than a decade, the Home Office has admitted.
Testing by the National Physical Laboratory (NPL) found that, in some settings, the technology was more likely to incorrectly identify black and Asian people than their white counterparts.
It checks live footage of people’s faces as they walk past mounted cameras against image watchlists of known or wanted criminals, allowing officers to target individuals.
Analysts found that, at its lower setting, the technology delivered more false positives (people wrongly flagged as matches) for black and Asian subjects than for white people: 5.5 per cent, 4 per cent and 0.04 per cent respectively.
The rate for black women was particularly high at 9.9 per cent, their report revealed on Thursday, hours after Policing Minister Sarah Jones had described the technology as the “biggest breakthrough since DNA matching.”
The Association of Police and Crime Commissioners said the findings showed an inbuilt bias, and that it “seems clear that technology has been deployed into operational policing without adequate safeguards in place.”
Liberty policy and campaigns officer Charlie Whelton said: “With thousands of searches a month using this discriminatory algorithm, there are now serious questions to be answered over just how many people of colour were falsely identified and what consequences this had.”
Big Brother Watch head of research and investigations Jake Hurfurt said: “It is beyond belief that the police have used a bias-riddled facial recognition system to regularly scan a database of millions of people’s photos for more than a decade.”
Officials insisted that there are manual safeguards, written into police training, operational practice and guidance, requiring all potential matches returned from the Police National Database to be visually assessed by a trained user and investigating officer.
But Black Activists Against Cuts co-founder Zita Holbourne told the Morning Star: “This is a ridiculous defence. Police forces are institutionally racist — how can black communities be expected to place trust in this?
“Where was the consultation and engagement with black communities? Where is the race equality impact assessment?”
Ministers have announced a 10-week public consultation on whether police should be able to go beyond their records to access other databases to track down criminals.
The Home Office said that a new algorithm with “no statistically significant bias” would be tested early next year.