Key Takeaways
- The UK government has announced plans to introduce a legal framework to regulate the use of facial recognition technology by police forces.
- The use of predictive policing tools by UK police forces has been criticized for violating human rights and reinforcing existing discrimination.
- The Met Police has deployed permanent live facial recognition cameras in Croydon, despite concerns about the technology’s impact on Black communities.
- Proposed reforms to police data protection rules could undermine law enforcement data adequacy with the European Union.
- An investigation has revealed that the EU’s law enforcement agency has been quietly amassing data to feed an ambitious-but-secretive AI development programme.
Introduction to Police Technology
The use of data-driven technologies such as facial recognition and predictive policing has become increasingly prevalent in UK police forces. In 2025, Computer Weekly’s police technology coverage focused extensively on developments in these areas, including the Met’s decision to deploy permanent live facial recognition cameras in Croydon and the Home Office’s launch of a formal consultation on laws to regulate the use of facial recognition. The consultation marks a distinct shift for the Home Office, which has previously claimed that a comprehensive legal framework for police facial recognition already exists. The current rules governing police use of the technology, however, have been criticized as complicated and difficult to understand.
Facial Recognition and Predictive Policing
The use of facial recognition technology by UK police forces has been criticized for its potential to infringe human rights. A report by Amnesty International found that predictive policing tools, which are based on profiling people or groups before they have committed a crime, can lead to the repeated targeting of poor and racialized communities. The report argued that these systems violate human rights and reinforce existing discrimination. Green Party MP Siân Berry has similarly called for predictive policing technologies to be prohibited in the UK, arguing that they infringe human rights and lead to the over-policing of certain communities. The Met Police’s decision to deploy permanent live facial recognition cameras in Croydon has been criticized for its potential impact on Black communities, who are already disproportionately represented in police data sets.
Police Data Protection Rules
Proposed reforms to police data protection rules could undermine law enforcement data adequacy with the European Union. The government’s Data Use and Access Bill seeks to amend the UK’s implementation of the EU Law Enforcement Directive, which is transposed into UK law via the Data Protection Act 2018. However, the bill’s proposed amendments could present a challenge for UK data adequacy, as they would allow the routine transfer of data to offshore cloud providers and remove the need for police to log justifications when accessing data. The European Commission has warned that the decision to grant "data adequacy" to the UK could be revoked if future data protection laws diverge significantly from those in Europe.
Police Hyperscale Cloud Use
The use of hyperscale cloud infrastructure by UK police forces has also raised concerns about data protection. Documents obtained from the Scottish Police Authority revealed that Microsoft is refusing to provide critical information about its data flows, citing "commercial confidentiality." This has made it difficult for policing bodies to satisfy the law enforcement-specific data protection rules laid out in Part Three of the Data Protection Act 2018. Further revelations published by Computer Weekly showed that policing data hosted in Microsoft’s hyperscale cloud infrastructure could be processed in over 100 countries, which could have serious implications for data subjects.
EU Law Enforcement Agency’s AI Development Programme
An investigation by freelance journalists Apostolis Fotiadis, Giacomo Zandonini, and Luděk Stavinoha has revealed that the EU’s law enforcement agency has been quietly amassing data to feed an ambitious-but-secretive AI development programme. The investigation raised serious questions about the implications of the agency’s AI programme for people’s privacy across the bloc and the impact of integrating automated technologies into everyday policing without adequate oversight. The use of AI in law enforcement has the potential to infringe human rights and reinforce existing discrimination, and it is essential that any development programme is subject to rigorous scrutiny and oversight.
Equality Impact Assessment
An equality impact assessment created by Essex Police for its use of live facial recognition has been criticized for its poor methodology and inconsistencies. The assessment relied on false comparisons to other algorithms and "parroting misleading claims" from the supplier about the LFR system’s lack of bias. The UK’s equality watchdog, the Equality and Human Rights Commission, has argued that police use of facial recognition technology may be unlawful unless it is necessary, proportionate, and constrained by appropriate safeguards. The commission highlighted the potential impact of the technology on people’s rights, particularly in the context of protests, where it could have a "chilling effect" on freedom of expression and assembly.
Conclusion
The use of data-driven technologies such as facial recognition and predictive policing by UK police forces has raised significant concerns about human rights and data protection. The government’s plans to introduce a legal framework to regulate police facial recognition are a welcome step, but any framework must itself be subject to rigorous scrutiny and independent oversight. Similar concerns apply to forces’ reliance on hyperscale cloud infrastructure, where opaque data flows make it difficult to verify compliance with UK data protection law. Ultimately, the use of technology in law enforcement must be transparent and accountable if it is not to infringe human rights or reinforce existing discrimination.


