Key Takeaways
- The chief constable of West Midlands police, Craig Guildford, has apologized to MPs for providing incorrect evidence about the decision to ban Maccabi Tel Aviv football fans
- The incorrect evidence was produced by artificial intelligence (AI), specifically Microsoft Copilot
- The AI-generated intelligence included a fictitious match between Maccabi Tel Aviv and West Ham, which was presented to the council-led security advisory group
- The home secretary, Shabana Mahmood, is preparing to make a statement to MPs about the findings of a report into the decision to ban Maccabi Tel Aviv fans
- The incident highlights concerns about the use of AI in policing and the potential for errors or biases in decision-making
Introduction to the Controversy
The chief constable of West Midlands police, Craig Guildford, has apologized to MPs for providing incorrect evidence about the decision to ban Maccabi Tel Aviv football fans from attending a Europa League match against Aston Villa in November. The apology came after it emerged that the incorrect evidence had been produced by artificial intelligence (AI), specifically Microsoft Copilot. As Guildford stated in an email to the home affairs select committee, "I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC. My belief that this was the case was honestly held and there was no intention to mislead the committee." The episode underscores the need for greater transparency and accountability in how AI is used in policing.
The Role of Artificial Intelligence
The incident has raised concerns about the use of AI in policing and the potential for errors or biases in decision-making. The fictitious match between Maccabi Tel Aviv and West Ham was included in police intelligence and presented to the council-led security advisory group, which made the decision to ban away fans. This raises questions about the reliability of AI-generated intelligence and the need for human oversight and verification. As Guildford noted, the inclusion of the fictitious match "arose as a result of a use of Microsoft Copilot", highlighting the risks of relying on unverified AI-generated information.
Previous Statements and Apology
Guildford had previously told MPs that the force did not use AI and that the mistake regarding the West Ham match was made by "one individual doing one Google search". It has since emerged that this was not the case and that the incorrect evidence was in fact produced by AI. Guildford has offered his "profound apology" for the error, acknowledging the mistake and taking responsibility on behalf of the force.
Implications and Next Steps
The incident has significant implications for the use of AI in policing and the need for greater transparency and accountability. The home secretary, Shabana Mahmood, is preparing to make a statement to MPs about the findings of a report by His Majesty’s Inspectorate of Constabulary into the decision to ban Maccabi Tel Aviv fans. That report is likely to provide further insight into the circumstances surrounding the decision and the role AI played in it. As the use of AI becomes more widespread in policing, it is essential that AI-generated intelligence is checked for accuracy and subject to human oversight before it informs decisions.
Conclusion and Reflection
The incident highlights the need for caution and careful consideration when using AI in policing. While AI can provide valuable insights and support decision-making, it is not a substitute for human judgment and oversight. Ultimately, the episode serves as a reminder that AI should be used in a responsible and transparent manner, with appropriate safeguards in place to catch errors or biases before they influence decisions.
https://www.theguardian.com/uk-news/2026/jan/14/west-midlands-police-chief-apologises-ai-error-maccabi-tel-aviv-ban
