Microsoft has admitted that a bug allowed its Copilot AI to summarize customers’ confidential emails without their permission for several weeks.
The bug, first reported by Bleeping Computer, allowed Copilot Chat to read and summarize the contents of emails starting in January, even for customers with data loss prevention policies designed to block sensitive information from reaching Microsoft’s large language model.
Copilot Chat lets paying Microsoft 365 customers use AI-powered chat features in Office apps such as Word, Excel, and PowerPoint.
Microsoft said the bug, which administrators can track as CW1226324, means “drafts and sent email messages with sensitivity labels applied are incorrectly handled by Microsoft 365 Copilot Chat.”
The tech giant said it began rolling out a fix for the bug in early February. A Microsoft spokesperson did not respond to requests for comment, including questions about how many customers were affected.
Earlier this week, the European Parliament’s IT department told lawmakers it had blocked AI functionality built into workplace-issued devices, citing concerns that the tools could upload potentially sensitive communications to the cloud.
