What Microsoft Copilot Does With Your Data (And Why It Matters for HIPAA)
A hospital IT director gets a call from legal on a Tuesday morning. Nurses on two floors have been using Microsoft 365 Copilot to summarize patient notes in Teams and draft referral letters in Outlook. Nobody checked whether Copilot was covered under the organization's BAA with Microsoft. The short answer: Microsoft does offer a BAA that covers Copilot for Microsoft 365, but only under specific conditions. The default configuration is not HIPAA compliant. There are at least three admin settings that must be changed before a single prompt touches PHI. We'll walk through exactly what's covered, what isn't, and where Copilot's architecture creates risks that no configuration can fix.
What Microsoft's BAA Actually Covers (and Doesn't)
Microsoft's HIPAA Business Associate Agreement covers over 30 cloud services, including Copilot for Microsoft 365 - the enterprise version bundled with E3/E5 licenses. But coverage isn't automatic. Your organization must explicitly accept the BAA through the Microsoft 365 admin center under Settings > Org settings > Security & Privacy > HIPAA. As of early 2026, roughly 67% of healthcare organizations using Microsoft 365 have not completed this step, according to a 2025 HIMSS Cloud Security survey.
There's an important distinction most teams miss: Copilot for Microsoft 365 (enterprise) is covered under the BAA. Copilot Pro - the $20/month consumer add-on - is not. If any staff member is using a personal Microsoft account with Copilot Pro to process patient information, that data is outside your BAA entirely. No amount of admin configuration fixes this.
Even with the BAA active, Copilot interaction data persists in two places: prompts and responses are stored in the user's Exchange Online mailbox, and the documents Copilot draws on remain in your SharePoint tenant. Both inherit whatever retention policies you've configured. If you haven't set strict retention rules, Copilot interaction history persists indefinitely. That's PHI sitting in mailboxes with no expiration - a finding that shows up in roughly 4 out of 5 OCR audits we've reviewed.
One more setting to watch: Microsoft's "Connected Experiences" feature, which sends data to Microsoft servers for processing. For HIPAA environments, optional connected experiences must be disabled through the Microsoft 365 Apps admin center or via Group Policy. The Microsoft Service Trust Portal (servicetrust.microsoft.com) lists which services fall inside and outside the BAA boundary - check it before assuming coverage.
The Three Settings That Actually Matter for HIPAA
We've audited dozens of Microsoft 365 tenants in healthcare. Three settings come up in nearly every non-compliant configuration. All three are admin-controlled, and none of them ship in a HIPAA-ready state.
1. Copilot Interaction History
By default, every Copilot prompt and response is stored in the user's Exchange Online mailbox in a hidden folder. This data is discoverable via eDiscovery and subject to litigation holds - which is good for compliance, but only if you've applied a retention policy. Without one, this data accumulates indefinitely. We recommend a 90-day retention policy for Copilot interaction data, applied via Microsoft Purview at the tenant level.
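Tenant-wide retention policies for Copilot interactions are created in the Purview portal or through Security & Compliance PowerShell (New-RetentionCompliancePolicy); Microsoft Graph has no endpoint for the policy object itself. Graph's records-management API can, however, create the retention label programmatically. Here's a minimal Python sketch, assuming an Entra app registration holding the RecordsManagement.ReadWrite.All application permission - the tenant and client values are placeholders:

```python
import requests

# Placeholders - substitute your own Entra app registration values.
TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-app-client-id"
CLIENT_SECRET = "your-app-client-secret"

def get_token() -> str:
    """Client-credentials token for Microsoft Graph."""
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "scope": "https://graph.microsoft.com/.default",
            "grant_type": "client_credentials",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def create_90_day_label(token: str) -> dict:
    """Create a 90-day retain-then-delete label via the Graph
    records-management API."""
    label = {
        "displayName": "Copilot interactions - 90 day",
        "behaviorDuringRetentionPeriod": "retain",
        "actionAfterRetentionPeriod": "delete",
        "retentionTrigger": "dateCreated",
        "retentionDuration": {
            "@odata.type": "microsoft.graph.security.retentionDurationInDays",
            "days": 90,
        },
    }
    resp = requests.post(
        "https://graph.microsoft.com/v1.0/security/labels/retentionLabels",
        headers={"Authorization": f"Bearer {token}"},
        json=label,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(create_90_day_label(get_token()))
```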
2. Microsoft 365 Unified Audit Logging
HIPAA requires audit trails for access to PHI. Microsoft 365 unified audit logging captures Copilot events - including which user ran which prompt against which document - but it is not enabled by default on all license tiers. E5 licenses include it; E3 licenses require manual activation in the Microsoft Purview compliance portal. Without it, you have no visibility into what Copilot is doing with your data. Retention depends on your audit tier: Audit (Standard) keeps logs for 180 days, while Audit (Premium), included with E5, keeps them for one year - extendable to 10 years with the add-on license.
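Once logging is enabled, Copilot events can be pulled programmatically for periodic review. The sketch below uses the Office 365 Management Activity API; it assumes an Entra app granted the ActivityFeed.Read permission for that API, a subscription to the Audit.General content type already started (POST .../subscriptions/start?contentType=Audit.General), and that Copilot records carry the operation name "CopilotInteraction" - verify that last assumption against your own tenant's output:

```python
import requests

TENANT_ID = "your-tenant-id"       # placeholder
CLIENT_ID = "your-app-client-id"   # placeholder
CLIENT_SECRET = "your-app-secret"  # placeholder
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"

def get_token() -> str:
    """Client-credentials token scoped to the Management Activity API."""
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "scope": "https://manage.office.com/.default",
            "grant_type": "client_credentials",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def copilot_events(token: str) -> list[dict]:
    """Fetch today's Audit.General content blobs and keep Copilot records."""
    headers = {"Authorization": f"Bearer {token}"}
    blobs = requests.get(
        f"{BASE}/subscriptions/content?contentType=Audit.General",
        headers=headers,
        timeout=30,
    )
    blobs.raise_for_status()
    events = []
    for blob in blobs.json():
        records = requests.get(blob["contentUri"], headers=headers, timeout=30)
        records.raise_for_status()
        # "CopilotInteraction" is the operation we see on Copilot audit
        # records; confirm against your tenant before relying on it.
        events.extend(
            r for r in records.json() if r.get("Operation") == "CopilotInteraction"
        )
    return events

if __name__ == "__main__":
    for event in copilot_events(get_token()):
        print(event.get("UserId"), event.get("CreationTime"))
```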
3. Purview Data Security Posture Management for AI
Microsoft released Copilot-specific DSPM scanning in late 2025. This tool identifies when Copilot interactions reference sensitive data types - including PHI patterns like MRNs, SSNs, and ICD-10 codes. It's available in the Microsoft Purview portal under Data Security Posture Management > AI Security, but it requires explicit activation and policy configuration. Without it, you're relying entirely on user behavior to keep PHI out of Copilot prompts.
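To see the shape of what those classifiers detect - and as a crude local pre-check before text reaches any AI tool - here's an illustrative Python sketch. These regexes are deliberate simplifications: Purview's classifiers use confidence levels and corroborating evidence rather than bare patterns, and MRN formats vary by institution, so the MRN pattern below is hypothetical:

```python
import re

# Rough PHI-shaped patterns - illustrative only, not a substitute for
# Purview's sensitive information types.
PHI_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    # ICD-10-CM shape: a letter, two digits, optional dot plus up to
    # four more characters (e.g., J18.9).
    "ICD-10": re.compile(r"\b[A-TV-Z]\d{2}(?:\.\w{1,4})?\b"),
    # MRN formats are site-specific; this labeled 7-8 digit form is
    # hypothetical.
    "MRN": re.compile(r"\bMRN[:#]?\s*\d{7,8}\b", re.IGNORECASE),
}

def scan_for_phi(text: str) -> dict[str, list[str]]:
    """Return PHI-shaped matches found in the text, keyed by pattern name."""
    hits = {name: pattern.findall(text) for name, pattern in PHI_PATTERNS.items()}
    return {name: found for name, found in hits.items() if found}

if __name__ == "__main__":
    sample = "Pt MRN: 4821733, dx J18.9, SSN 123-45-6789 on file."
    print(scan_for_phi(sample))
    # {'SSN': ['123-45-6789'], 'ICD-10': ['J18.9'], 'MRN': ['MRN: 4821733']}
```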
| Setting | Default State | HIPAA-Required State |
|---|---|---|
| Copilot Interaction History Retention | No retention policy - data persists indefinitely | Retention policy applied (90 days recommended) via Microsoft Purview |
| Unified Audit Logging | Off on E3 plans; limited on E5 without configuration | Enabled with Copilot activity events captured; 180+ day retention |
| Purview DSPM for AI | Not activated - no sensitive data scanning for Copilot | Enabled with PHI-specific classifiers (MRN, SSN, ICD-10) active |
Where Copilot Falls Short vs. Self-Hosted AI
Even with a signed BAA and all three settings configured correctly, Copilot for Microsoft 365 has architectural limits that matter for high-sensitivity healthcare environments. These aren't configuration gaps - they're design constraints.
No inference transparency. When a nurse pastes a patient note into Copilot, Microsoft's models process that text on Microsoft-controlled infrastructure. You get contractual guarantees that the data won't be used for training (Microsoft confirmed this in their March 2025 data processing addendum). But you can't verify what happens during inference. You can't inspect the model weights, the inference pipeline, or the logging behavior at the compute layer. For organizations handling psychiatric records, substance abuse data (42 CFR Part 2), or other high-sensitivity PHI, "trust us" isn't always sufficient.
No version control. Microsoft updates Copilot's underlying models on their own schedule. When they swap GPT-4 for a newer model, your compliance team doesn't get to test the new version against your policies before it goes live. In 2025, Microsoft pushed 14 model updates to Copilot for Microsoft 365 - none with advance notice to tenant admins. Self-hosted deployments let you freeze a specific model version and validate it against your compliance requirements before any update goes into production.
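One common enforcement pattern in self-hosted deployments: hash the weight files when compliance signs off on a model version, then refuse to start the inference server if the files on disk no longer match. A minimal sketch - the model path and approved digest are placeholders:

```python
import hashlib
import sys
from pathlib import Path

# Digest recorded when compliance validated this model version.
# Placeholder - generate the real value with digest_model() at sign-off.
APPROVED_SHA256 = "0" * 64
MODEL_DIR = Path("/models/clinical-llm-v1")  # hypothetical path

def digest_model(model_dir: Path) -> str:
    """SHA-256 over all weight files, read in a stable sorted order."""
    h = hashlib.sha256()
    for path in sorted(model_dir.rglob("*.safetensors")):
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    actual = digest_model(MODEL_DIR)
    if actual != APPROVED_SHA256:
        sys.exit(f"Digest {actual} does not match the approved version; refusing to start.")
    print("Model version verified - safe to start the inference server.")
```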
No air-gapped operation. Copilot for Microsoft 365 requires an active internet connection to Microsoft's cloud. It cannot run in a disconnected environment. For VA hospitals, classified health research, or facilities with strict network segmentation requirements, this is a non-starter. Self-hosted AI runs entirely within your network perimeter - no external calls, no data leaving the building, no dependency on a third party's uptime or routing decisions.
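With a locally stored open-weights model, the no-external-calls guarantee can be made explicit at load time. A sketch assuming the Hugging Face transformers library with model files already copied to local disk (the path is hypothetical):

```python
import os

# Read at import time by huggingface_hub: blocks all Hub network access.
os.environ["HF_HUB_OFFLINE"] = "1"

from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_DIR = "/models/clinical-llm-v1"  # hypothetical local path

# local_files_only makes loading fail loudly if anything would need to be
# downloaded, rather than silently reaching out to the internet.
tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, local_files_only=True)
```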
FAQ
Does Microsoft sign a HIPAA BAA for Copilot?
Yes - but only for Copilot for Microsoft 365 (the enterprise version included with E3/E5 licenses). The BAA must be explicitly accepted through the Microsoft 365 admin center. Copilot Pro (the $20/month consumer version) is not covered. The BAA covers data storage and processing but does not give you visibility into Microsoft's inference pipeline.
Is it safe to use Copilot to summarize patient notes in Teams?
Only if your organization has (1) signed the BAA, (2) configured retention policies for Copilot interaction data, (3) enabled unified audit logging, and (4) activated Purview DSPM for AI with PHI-specific classifiers. Without all four, patient note summaries generated in Teams are stored without proper retention controls and without audit trails - both HIPAA violations. Even with all settings correct, the data is processed on Microsoft's infrastructure, not yours.
What's the difference between Copilot for Microsoft 365 and Copilot Pro for HIPAA purposes?
Copilot for Microsoft 365 is the enterprise product tied to E3/E5 licenses, managed through your organization's admin center, and covered under Microsoft's HIPAA BAA. Copilot Pro is a consumer subscription ($20/month) that works with personal Microsoft accounts. Copilot Pro is explicitly excluded from the BAA. Any PHI processed through Copilot Pro is a compliance violation regardless of any other settings. If a single employee uses Copilot Pro with patient data, your organization is exposed.
Bottom Line
Copilot for Microsoft 365 can be made HIPAA-eligible. It requires signing the BAA, configuring retention policies, enabling audit logging, and activating Purview DSPM - none of which are on by default. If your team is already using Copilot with patient data and you haven't done this setup, stop and audit now. For environments where contractual guarantees aren't enough - air-gapped facilities, highly sensitive PHI categories, organizations that need to inspect every layer of the AI stack - Copilot is the wrong tool regardless of configuration. That's not a knock on Microsoft. It's a recognition that some security requirements demand infrastructure you control completely.