Commissioner 'shocked' by West Midlands Police's use of AI for Maccabi fans ban - but maintains 'AI is a positive thing'
West Midlands Police and Crime Commissioner Simon Foster said he was "shocked" by the way artificial intelligence (AI) was used to justify a ban of Israeli football fans from Birmingham.
West Midlands Police came under fire when incorrect information obtained from Microsoft Copilot was used as part of a decision by the Birmingham Safety Advisory Group to bar Maccabi Tel Aviv fans from attending a Europa League game against Aston Villa in November.
The fallout resulted in the retirement of former West Midlands Police chief constable Craig Guildford, who had admitted misleading MPs after previously denying the use of AI.
At a West Midlands Police and Crime Panel meeting, Mr Foster and acting chief constable Scott Green faced further questions about the issue from councillors.
Councillor Jilly Bermingham said: “The most important lesson going forward is the use of AI which was found to be a key failure within the decision.
“Can you assure us how you are going to make sure that use of AI does not create the same problem again, and how is AI being ensured to be safe from outside sources?”
Mr Foster said: “The acting chief constable has immediately turned off Microsoft Copilot within West Midlands Police in order to ensure the use is reviewed.
“I suspect it will not be turned back on until the acting chief constable is satisfied it can be relied on and properly regulated and managed.
“I have an ethics panel within the Office of the Police and Crime Commissioner which will be considering matters arising out of the use of AI by policing.
“Plainly, artificial intelligence is a positive thing. The Government white paper published only a couple of weeks ago makes it very clear it sees artificial intelligence in many respects as the future of policing.
“AI has to be managed, regulated, understood. We have to make sure it is used lawfully, ethically and responsibly.
“We need to make sure AI is not used in a way that causes inaccuracies and hallucinations, and if you’re going to use AI it is properly checked, verified and it goes through the proper intelligence management structures that were established with Operation Parkmill, which did not happen on this occasion.
“That indeed was a significant failure, concern and shortcoming.”
Panel chair Councillor Suky Samra added: “I’m sure we’ve all seen the film Minority Report. We’re delving into AI predicting future crime, which in this particular incident it was used for.
“I take it you were quite shocked to learn it was used in this instance?”
Mr Foster said: “I was shocked to hear it wasn’t being used in a way that ensured that its use was properly managed and regulated and that the information being obtained was not being properly cross checked.
“It was not being filtered through the normal intelligence management procedures that had been set up as part of Operation Parkmill.”
Acting chief constable Scott Green said AI was not being used widely across the force.