Microsoft Copilot Exposed Private Emails: What Happened and Why It Matters

A flaw in Microsoft’s Copilot chat function allowed unauthorized parties to view confidential emails and file summaries, prompting Microsoft to release an emergency patch. The incident raises a serious question every organization using AI-assisted tools should ask: how much do you actually control what your AI sees and shares?

What the Bug Actually Did

The vulnerability let Copilot surface sensitive data — email contents and file summaries — in chat replies that third parties could read. In security terms, this is classified as an information disclosure flaw: the system did not need to be “hacked” in the traditional sense. Copilot’s own responses became the leak.

This mirrors a broader pattern seen in Copilot’s security history. The EchoLeak flaw (CVE-2025-32711, CVSS score: 9.3), discovered in January 2025, demonstrated that attackers could steal sensitive Microsoft 365 data — including emails, Teams conversations, and SharePoint content — simply by sending a malicious email, with zero user interaction required. That vulnerability was fixed server-side in May 2025 and required no action from users.

More recently, a Lasso Security investigation found that Microsoft Copilot could still access cached data from deleted or private repositories even after Microsoft applied a patch — meaning the fix was only partial at the time. Human users were blocked from retrieving that data, but Copilot retained access to it.

Microsoft’s Response

Microsoft confirmed the issue and released a patch. For the EchoLeak class of vulnerabilities, the fix was applied server-side, removing the burden of any manual update from end users or IT administrators. Microsoft has also stated it is implementing additional defense-in-depth measures to further strengthen its security posture.

However, a critical gap remains: the full scope of affected customers has not been publicly confirmed, and it is not yet clear whether every affected organization has been notified. Microsoft classified at least one related issue as “important” but issued neither a public disclosure nor direct customer notifications.

What Remains Uncertain

Two questions remain open. First, how many organizations were actually affected before the patch was applied. Second, whether partial fixes — like the one Lasso Security identified — leave residual exposure through Copilot’s backend data access. Jeff Pollard, VP and Principal Analyst at Forrester, put it plainly: once an AI tool is authorized to read email, schedule meetings, and send responses on your behalf, it becomes a high-value target for attackers.

What Organizations Should Do Now

You do not need to wait for Microsoft to notify you. Take these steps now:

  • Audit Copilot permissions — review what data Copilot can access across email, OneDrive, SharePoint, and Teams (a scripted starting point follows this list)
  • Rotate or revoke any API keys or credentials that Copilot may have accessed, especially if your organization uses third-party integrations (see the rotation sketch below)
  • Monitor Copilot audit logs — researchers separately found that Copilot could suppress its own audit entries when prompted to omit file links from responses, so collect and review logs independently (sketched below)
  • Apply all available Microsoft 365 updates — even when patches are server-side, ensure your tenant configuration reflects Microsoft’s latest security guidance
  • Follow Microsoft’s Security Copilot release notes for ongoing transparency about what has been patched
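
For the permissions audit, one scripted starting point is to enumerate your tenant’s delegated OAuth2 permission grants through Microsoft Graph and flag apps holding mail- or file-reading scopes. This is a minimal sketch, not a complete audit: the placeholder token, the illustrative SENSITIVE scope list, and the focus on delegated (rather than application) permissions are assumptions to adapt for your own tenant.

```python
# Minimal sketch: flag delegated permission grants with mail/file scopes
# via Microsoft Graph. ACCESS_TOKEN is a placeholder; acquire a real
# token (e.g. through MSAL with Directory.Read.All) before running.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token-with-Directory.Read.All>"  # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Illustrative, not exhaustive: scopes that expose mail and file content.
SENSITIVE = {"Mail.Read", "Mail.ReadWrite", "Files.Read.All", "Sites.Read.All"}

def permission_grants():
    """Yield every delegated OAuth2 permission grant, following Graph paging."""
    url = f"{GRAPH}/oauth2PermissionGrants"
    while url:
        resp = requests.get(url, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        page = resp.json()
        yield from page.get("value", [])
        url = page.get("@odata.nextLink")  # present when more pages remain

for grant in permission_grants():
    scopes = set((grant.get("scope") or "").split())
    risky = sorted(scopes & SENSITIVE)
    if risky:
        print(f"clientId={grant['clientId']} "
              f"consentType={grant['consentType']} scopes={risky}")
```

App-only (application) permissions live under each service principal’s appRoleAssignments in Graph and deserve the same scrutiny.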
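
For credential rotation, Microsoft Graph exposes addPassword and removePassword actions on app registrations, which lets you stage a replacement secret before retiring the old one. The object ID, key ID, and token below are placeholders, and the sketch covers client secrets only — certificates and third-party keys need their own process.

```python
# Hedged sketch: rotate an app registration's client secret using Microsoft
# Graph's addPassword/removePassword actions. All IDs and the token are
# placeholders for your own tenant's values.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token-with-Application.ReadWrite.All>"  # placeholder
APP_OBJECT_ID = "<application-object-id>"                # placeholder
OLD_KEY_ID = "<keyId-of-secret-to-retire>"               # placeholder
HEADERS = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
}

# 1. Create the replacement secret first, so dependent services can cut over.
resp = requests.post(
    f"{GRAPH}/applications/{APP_OBJECT_ID}/addPassword",
    json={"passwordCredential": {"displayName": "post-incident rotation"}},
    headers=HEADERS, timeout=30,
)
resp.raise_for_status()
print("new keyId:", resp.json()["keyId"])  # secretText is returned only once

# 2. Once every integration uses the new secret, revoke the old one.
requests.post(
    f"{GRAPH}/applications/{APP_OBJECT_ID}/removePassword",
    json={"keyId": OLD_KEY_ID},
    headers=HEADERS, timeout=30,
).raise_for_status()
```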
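
Because the suppressed-audit-entry issue means Copilot’s own logging cannot be taken at face value, pulling audit events into your own pipeline is worth automating. The sketch below reads the Office 365 Management Activity API’s Audit.General feed and filters for Copilot events. The tenant ID and token are placeholders, it assumes an Audit.General subscription already exists, and the “CopilotInteraction” operation name should be verified against current Microsoft documentation.

```python
# Hedged sketch: pull Copilot events from the Office 365 Management
# Activity API's Audit.General feed. Assumes an active subscription to
# that content type; TENANT_ID and ACCESS_TOKEN are placeholders.
import requests

TENANT_ID = "<tenant-guid>"                 # placeholder
ACCESS_TOKEN = "<manage.office.com token>"  # placeholder
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1. List the available content blobs for the Audit.General feed.
blobs = requests.get(
    f"{BASE}/subscriptions/content",
    params={"contentType": "Audit.General"},
    headers=HEADERS, timeout=30,
)
blobs.raise_for_status()

# 2. Fetch each blob and keep Copilot entries. "CopilotInteraction" is the
#    operation name Copilot activity carries in the unified audit log at
#    the time of writing; confirm against current docs.
for blob in blobs.json():
    events = requests.get(blob["contentUri"], headers=HEADERS, timeout=30)
    events.raise_for_status()
    for ev in events.json():
        if ev.get("Operation") == "CopilotInteraction":
            print(ev.get("CreationTime"), ev.get("UserId"), ev.get("Workload"))
```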

The broader lesson here is structural: AI tools that aggregate access to your most sensitive data operate at the intersection of convenience and risk. Treat Copilot’s data permissions with the same rigor you apply to privileged user accounts — because in terms of data exposure, the risk profile is equivalent.