
UK police blame Microsoft Copilot for intelligence mistake

Copilot invented a nonexistent football match that was included in an intelligence report.

Image: The Verge
Tom Warren is a senior correspondent and author of Notepad, who has been covering all things Microsoft, PC, and tech for over 20 years.

The chief constable of one of Britain’s largest police forces has admitted that Microsoft’s Copilot AI assistant made a mistake in a football (soccer) intelligence report. The report, which led to Israeli football fans being banned from a match last year, included a nonexistent match between West Ham and Maccabi Tel Aviv.

Copilot hallucinated the game, and West Midlands Police included the error in its intelligence report without fact-checking it. “On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot [sic],” says Craig Guildford, chief constable of West Midlands Police, in a letter to the Home Affairs Committee earlier this week. Guildford had previously denied in December that West Midlands Police used AI to prepare the report, blaming “social media scraping” for the error, and earlier this month he initially attributed the mistake to a Google search result.

Maccabi Tel Aviv fans were banned from a Europa League match against Aston Villa in November last year because the Birmingham Safety Advisory Group deemed the match “high risk” after “violent clashes and hate crime offences” at a previous Maccabi Tel Aviv match in Amsterdam.

As Microsoft warns at the bottom of its Copilot interface, “Copilot may make mistakes.” This is a pretty high-profile mistake, though. We tested Copilot Vision recently, and my colleague Antonio G. Di Benedetto found that Microsoft’s AI assistant often “got things wrong” and “made stuff up.”

Microsoft hasn’t been able to confirm that Copilot was involved in this particular mistake, and in a statement to The Verge the company made clear that the police force should review the sources of information Copilot provides. “We are not able to replicate what is being reported,” says Jeffrey Jones, senior director of communications at Microsoft. “Copilot combines information from multiple web sources into a single response with linked citations. It informs users they are interacting with an AI system and encourages them to review the sources.”

Update, January 14th: Article updated with Guildford’s previous comments about a Google search and Microsoft’s statement.
