Microsoft introduced the Copilot Chat feature in Word, Excel, PowerPoint, and Outlook last year, but a recently discovered flaw has raised serious concerns. According to reports, a bug in Microsoft 365 Copilot allowed the AI assistant to generate summaries of confidential Outlook emails for several weeks, despite protections that should have prevented it.
The problem reportedly bypassed organisations’ Data Loss Prevention (DLP) policies, which exist to protect sensitive information. Although the company has released a fix, the incident has sparked privacy concerns among users. Confidentiality labels in Outlook are meant to restrict how sensitive communications are accessed and processed.
How Copilot Accessed Confidential Emails
Microsoft told BleepingComputer that the Microsoft 365 Copilot bug allowed the AI to summarise confidential emails starting in late January. The issue, tracked as CW1226324, was first identified on January 21 and affected the Copilot “Work tab” chat feature.
Due to the flaw, Copilot read and summarised emails stored in users’ Sent Items and Drafts folders. This included messages marked with confidentiality labels that should have prevented automated processing.
The bug also sidestepped the Data Loss Prevention safeguards that organisations rely on to protect private data. Microsoft acknowledged that the Work tab chat feature in Microsoft 365 Copilot Chat generated summaries of emails marked confidential even though sensitivity labels and DLP protections were active.
Fix Released, Monitoring Continues
Microsoft attributed the issue to a coding error and began rolling out a fix in early February. The company says it is still monitoring the rollout and contacting some affected users to confirm the fix is working as intended.
Microsoft has not disclosed how many users were impacted. The incident comes as the company continues expanding AI features across Outlook, Word, Excel, and PowerPoint. It also introduced AI-powered shopping features for Copilot in the Edge browser last year.
