Key Takeaways:
- UK police forces have been using a facial recognition system that is biased against women, young people, and members of ethnic minority groups.
- Despite knowing about the bias for over a year, police forces lobbied to run the system at a lower confidence threshold, which increased the number of potential matches but also raised the risk of false positives.
- The Home Office has admitted that the technology is biased and has taken steps to address the issue, including procuring a new algorithm with no statistically significant bias.
- The use of facial recognition technology is set to be widened, with the government opening a ten-week consultation on its plans.
- Critics have raised concerns about the priorities of police forces and the potential for the technology to compound racial disparities.
Introduction to Facial Recognition Technology
The use of facial recognition technology by police forces in the UK has been a topic of controversy in recent years. The technology is used to conduct retrospective facial recognition searches, whereby a "probe image" of a suspect is compared to a database of more than 19 million custody photos for potential matches. However, a review by the National Physical Laboratory (NPL) found that the technology was biased, misidentifying Black and Asian people and women at significantly higher rates than white men.
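At a high level, a retrospective search of this kind ranks gallery images by similarity to the probe image and keeps only candidates scoring above a confidence threshold. The sketch below is a generic illustration using cosine similarity over synthetic embeddings; the function name, the metric, and every number are assumptions for illustration only, not details of the system the article describes.

```python
import numpy as np

def retrospective_search(probe_embedding, custody_embeddings, threshold):
    """Conceptual sketch: rank a gallery of face embeddings by cosine
    similarity to a probe embedding and return candidates at or above
    a confidence threshold. Purely illustrative."""
    # Normalise so the dot product equals cosine similarity.
    probe = probe_embedding / np.linalg.norm(probe_embedding)
    gallery = custody_embeddings / np.linalg.norm(
        custody_embeddings, axis=1, keepdims=True
    )
    scores = gallery @ probe
    candidates = [(i, float(s)) for i, s in enumerate(scores) if s >= threshold]
    # Highest-confidence candidates first.
    return sorted(candidates, key=lambda c: c[1], reverse=True)

# Toy example with random 128-dimensional embeddings.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))
probe = gallery[42] + rng.normal(scale=0.1, size=128)  # noisy copy of entry 42
matches = retrospective_search(probe, gallery, threshold=0.9)
print(matches[0][0])  # prints 42: the noisy copy ranks first
```

Lowering `threshold` admits more candidates per search, which is exactly the trade-off at the heart of the dispute described below: more "investigative leads", but more false positives among them.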
The Bias in Facial Recognition Technology
The bias in the facial recognition system was first identified in September 2024, following a Home Office-commissioned review by the NPL. The review found that the system was more likely to suggest incorrect matches for probe images depicting women, Black people, and those aged 40 and under. In response, the confidence threshold required for potential matches was initially raised, which mitigated the bias; police forces then lobbied to overturn that decision. Documents from the National Police Chiefs' Council (NPCC) show that the higher threshold reduced the proportion of searches returning potential matches from 56% to 14%. Forces complained that the system was producing fewer "investigative leads", and the decision was reversed the following month.
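The trade-off the NPCC documents describe, in which a higher threshold yields fewer searches with potential matches, can be shown with a toy simulation. Every distribution, threshold, and figure below is invented for illustration and bears no relation to the real system's numbers.

```python
import random

def lead_rate(search_scores, threshold):
    """Fraction of searches returning at least one candidate at or
    above the threshold, i.e. producing an 'investigative lead'.
    All inputs here are simulated, not real match data."""
    hits = sum(1 for scores in search_scores
               if any(s >= threshold for s in scores))
    return hits / len(search_scores)

random.seed(1)
# Simulate 1,000 searches, each scoring 50 gallery candidates with a
# right-skewed similarity distribution (most scores are low).
searches = [[random.betavariate(2, 8) for _ in range(50)]
            for _ in range(1000)]
low, high = 0.55, 0.75  # hypothetical confidence thresholds
print(lead_rate(searches, low) > lead_rate(searches, high))  # prints True
```

Raising the threshold cuts the lead rate sharply, which is the pattern the NPCC documents record (56% of searches yielding matches at the lower threshold versus 14% at the higher one), at the cost of admitting more false positives when the threshold is lowered again.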
The Impact of the Bias
The impact of the bias in the facial recognition system is significant. The NPL study found that, at certain settings, the system could produce false positives for Black women almost 100 times more frequently than for white women. This raises serious concerns about the potential for the technology to compound racial disparities and undermine trust in the police. Critics have argued that the priorities of police forces are misguided, with Prof Pete Fussey, a former independent reviewer of the Met's use of facial recognition, stating that "convenience is a weak argument for overriding fundamental rights, and one unlikely to withstand legal scrutiny".
Government Response and Consultation
The government has opened a ten-week consultation on its plans to widen the use of facial recognition technology. The policing minister, Sarah Jones, has described the technology as the "biggest breakthrough since DNA matching". However, critics have raised concerns about the absence of any discussion of the rollout in the context of the police race action plan. Abimbola Johnson, chair of the independent scrutiny and oversight board for the police race action plan, said that "these revelations show once again that the anti-racism commitments policing has made through the race action plan are not being translated into wider practice".
Conclusion and Future Developments
The use of facial recognition technology by police forces in the UK is a complex and contentious issue. While the technology has the potential to help police put criminals behind bars, it also raises serious concerns about bias and the compounding of racial disparities. The Home Office has taken steps to address the issue, including procuring a new algorithm with no statistically significant bias. However, critics have questioned the priorities of police forces and called for strict national standards and independent scrutiny. As the government's consultation continues, it is essential that these concerns are taken into account and that the technology is used in a way that prioritises fairness, transparency, and accountability.