TLDRs:
- Microsoft confirms Copilot AI read confidential emails without permission.
- The Office AI bug affected draft and sent emails for weeks.
- European Parliament disables AI tools over privacy concerns.
- Stock reacts cautiously as investors digest potential reputational risk.
Microsoft has acknowledged a significant privacy issue in its Copilot AI, part of its Microsoft 365 suite.
The bug allowed the AI-powered Copilot Chat to access and summarize confidential email content without customer consent for several weeks, raising concerns among business and government users alike.
The problem, first flagged by tech outlet Bleeping Computer, reportedly started in January. Even organizations with strict data loss prevention policies saw their sensitive emails processed by the AI. This incident highlights the growing pains of integrating AI tools directly into widely used productivity software.
Bug Impacts Drafts and Sent Messages
Microsoft’s own guidance tracks the issue under ID CW1226324. Emails carrying a confidentiality label, whether still in draft or already sent, were inadvertently summarized by Copilot Chat. While Microsoft says a fix began rolling out in February, it has not disclosed how many customers were affected.
Copilot Chat is a feature for Microsoft 365 subscribers, offering AI-driven assistance across Word, Excel, and PowerPoint. Users can ask the tool to summarize documents, extract insights, or draft text, but the exposure of confidential email content marks a notable lapse in security controls, undermining trust in Microsoft’s AI deployment.
European Parliament Takes Precautionary Action
The privacy concerns triggered immediate responses in sensitive environments. Earlier this week, the European Parliament’s IT department blocked all Copilot AI functions on work-issued devices. Officials cited worries that the AI could inadvertently upload confidential correspondence to Microsoft’s cloud servers, potentially violating internal data protection protocols.
This precaution reflects a wider debate on how enterprise AI should handle private information. With regulators and large institutions increasingly scrutinizing AI, Microsoft faces pressure to demonstrate stronger safeguards and transparency to maintain customer confidence.
Stock Reacts Amid Reputation Concerns
Investors have responded cautiously to the news. Microsoft’s stock (MSFT) edged lower as analysts and traders weighed the reputational impact of the bug against the company’s strong overall financial performance. While the fix rollout mitigates the immediate risk, uncertainty about the scale of the exposure could slow enterprise clients’ adoption of AI features.
Analysts note that privacy lapses in productivity tools can have outsized consequences, particularly as corporate and government clients demand robust controls. The incident also serves as a reminder of the challenges tech giants face when integrating advanced AI capabilities with established enterprise software.
Looking Ahead
Microsoft is expected to provide more detailed guidance on the bug’s impact in the coming weeks. Stakeholders will likely scrutinize both the technical measures implemented to prevent recurrence and the company’s communication strategy around AI privacy.
For now, users of Microsoft 365 are advised to review Copilot Chat activity, ensure proper labeling of sensitive emails, and monitor updates from the company regarding the full deployment of the patch.
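For administrators who want to act on that advice programmatically, a minimal sketch like the one below can flag mail marked confidential so it can be cross-checked against Copilot Chat activity logs. This is an illustrative assumption, not Microsoft's remediation: the message dictionaries merely mirror the shape of objects returned by Microsoft Graph's messages endpoint (which exposes a `sensitivity` property on mail items), and fetching them, with authentication and paging, is assumed to happen elsewhere.

```python
# Sketch: filter mailbox items by Outlook sensitivity level so admins can
# audit which confidential messages Copilot Chat may have touched.
# The dicts below are hypothetical samples shaped like Microsoft Graph
# message objects; real retrieval (OAuth, paging) is out of scope here.

def confidential_messages(messages):
    """Return only the messages marked with the 'confidential' sensitivity."""
    return [m for m in messages if m.get("sensitivity") == "confidential"]

# Hypothetical inbox sample for illustration.
inbox = [
    {"subject": "Q3 budget draft", "sensitivity": "confidential"},
    {"subject": "Lunch plans", "sensitivity": "normal"},
]

flagged = confidential_messages(inbox)
print([m["subject"] for m in flagged])  # → ['Q3 budget draft']
```

In practice the filtering would run over pages of real Graph results rather than a hardcoded list, and the flagged subjects would be reconciled against organizational audit logs once Microsoft publishes fuller guidance.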