What CDP actually gives you
The Microsoft Products and Services Data Protection Addendum, the Volume Licensing terms, and the Copilot-specific additions in the Microsoft 365 service description together amount to three plain claims for a signed-in school user:
- Your prompts and responses are not used to train Microsoft's foundation models. This is the claim schools care about most. It is contractual, not just marketing — the DPA language is explicit.
- Your data stays within your tenant's trust boundary. Copilot interactions are bounded to your organisation's Microsoft 365 environment. Another school's tenant cannot see your content; Microsoft engineers cannot casually access it; the same access controls that govern your email and OneDrive apply to Copilot artefacts.
- Microsoft acts as a processor under your existing DPA. You remain the controller. Microsoft's obligations — security, breach notification, sub-processor transparency, data subject request assistance — are the same ones already in force for the rest of M365.
What CDP is not: it is not magic, it does not change your obligations as controller, and it does not exempt Copilot from your school's Data Protection Impact Assessment. The DPIA still needs to cover why you are processing student data, on what lawful basis, with what data minimisation, and for how long. CDP is a mitigation that reduces some risks; the DPIA is the document where those mitigations are weighed against the risks that remain.
Sign-in equals compliance
The single biggest point of failure in any Copilot rollout is a teacher using consumer Copilot by accident. It is startlingly easy to do: a browser that remembers the personal Microsoft Account, a colleague's shared link that opens copilot.microsoft.com rather than copilot.cloud.microsoft, a Chrome profile that was not fully signed out. The consumer service looks almost identical to the work one and behaves almost the same — but the data path is different and the contractual protections are different.
The visual cues are worth training staff to recognise. At the top right of copilot.cloud.microsoft, a signed-in work account shows a shield icon and a "Protected" badge; consumer Copilot shows the Microsoft Account avatar with no badge. Inside the M365 desktop apps the risk is much lower — those surfaces do not expose the consumer path — but the browser surface is where the line gets crossed. Policy-wise, the defensible position is to require signed-in work-account use for all Copilot interactions and, where your tenant controls allow, block the consumer domain (copilot.microsoft.com) via conditional access on school-managed devices. The Microsoft admin-centre documentation calls this "Access Protection for Copilot" and it is worth the thirty minutes it takes to configure.
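If you do configure the block, verify it from an actual school-managed device rather than trusting the admin-centre tick-box. A minimal sketch for that check follows; it is illustrative, not a Microsoft tool, and it assumes the block is enforced somewhere on the network path (DNS, proxy, or firewall) so that the consumer domain simply stops answering.

```python
# Reachability check for the two Copilot domains from a managed device.
# Illustrative sketch: assumes a DNS-, proxy-, or firewall-level block,
# so "blocked" here just means no TCP connection on port 443.
import socket

DOMAINS = {
    "copilot.microsoft.com": "consumer (should be blocked)",
    "copilot.cloud.microsoft": "work/school (should resolve)",
}

def reachable(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for host, expectation in DOMAINS.items():
        status = "reachable" if reachable(host) else "blocked/unreachable"
        print(f"{host:<26} {expectation:<30} -> {status}")
```

Run it from a staff laptop on the school network: if the consumer domain still answers, the policy is not yet doing what your DPIA will claim it does.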
EU Data Boundary and Schrems II
Microsoft's EU Data Boundary is a contractual commitment to store and process EU customer data inside EU regions, including for Copilot, with specific carve-outs that are published and updated periodically. In practice that means the overwhelming majority of Copilot prompts, responses, and metadata for a European tenant never leave the EU. Some limited telemetry and a small set of sub-processing operations may still transit to US infrastructure; the exact list is available from Microsoft's documentation and changes as the architecture evolves.
Underneath this sits the Schrems II reality. Microsoft is a US-headquartered company, so transfers of EU personal data to US processors (where they occur) require Standard Contractual Clauses, supplementary measures, and the ongoing analysis of US surveillance law that has shaped EU cloud compliance since 2020. Microsoft is also certified under the EU-US Data Privacy Framework, which provides a currently-valid adequacy basis for some transfers — though the legal stability of that framework remains the subject of ongoing debate in European courts and regulators.
The honest sentence your DPIA should contain: this is among the best-documented and most robust data-protection positions offered by any major cloud AI provider, yet it remains a relationship with a US-headquartered processor whose complex cross-border dependencies the school should name explicitly rather than hand-wave past. The EU Data Boundary materially reduces exposure; it does not produce a purely European data path, and treating it as such is the kind of compliance assertion that gets caught in a real audit.
Licensing — which seat does what
The four SKUs introduced in Part 1 differ in what they give access to, and a clear read on them matters before procurement discussions start:
- Copilot (free, consumer) at copilot.microsoft.com: web chat only, no integration with Word, Excel, PowerPoint, Outlook, or Teams. No Commercial Data Protection. Consumer terms.
- Copilot Chat with work/school account: web and Teams sidebar chat, CDP enabled, free with any Microsoft 365 licence. No in-app integration into Word or Outlook — that is the M365 Copilot SKU.
- Copilot Pro (consumer paid): image generation, priority model access, and longer context on the consumer side. Not covered by Commercial Data Protection; should not be used for school work.
- Microsoft 365 Copilot: the integration into Word, Excel, PowerPoint, Outlook, Teams, OneNote, plus tenant-grounded answers that can reference your own files. Approximately kr 350 per user per month as a tenant add-on, list price — verify current pricing with your Microsoft reseller, because education tenants often have framework-agreement discounts.
A phased rollout is the sensible pattern. Start with school leadership and subject leads for one term — typically fifteen to twenty-five seats — so you can build a genuine picture of where Copilot adds value in your specific school before committing to full-staff licensing. Broader rollout follows once the policy framework and staff training are in place. Most tenants do not need to buy seats for every teacher on day one; the free Copilot Chat tier covers enough of the baseline that a staged approach rarely leaves anyone stuck.
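During a phased rollout it also helps to keep an authoritative list of who actually holds a paid seat, rather than reconstructing it from procurement emails. A minimal sketch against the Microsoft Graph users endpoint, assuming you already have an access token with User.Read.All and have looked up your tenant's Microsoft 365 Copilot SKU GUID (GRAPH_TOKEN and COPILOT_SKU_ID below are placeholders you must supply, not real values):

```python
# List accounts holding a given licence SKU via Microsoft Graph.
# Sketch only: GRAPH_TOKEN and COPILOT_SKU_ID are assumed to be set by
# you -- the token needs User.Read.All, and the SKU GUID comes from your
# own tenant's subscribedSkus list.
import os
import requests

GRAPH_TOKEN = os.environ["GRAPH_TOKEN"]        # app-only or delegated token
COPILOT_SKU_ID = os.environ["COPILOT_SKU_ID"].lower()  # your tenant's SKU GUID

url = ("https://graph.microsoft.com/v1.0/users"
       "?$select=displayName,userPrincipalName,assignedLicenses&$top=999")
headers = {"Authorization": f"Bearer {GRAPH_TOKEN}"}

licensed = []
while url:
    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()
    page = resp.json()
    for user in page.get("value", []):
        if any(lic["skuId"].lower() == COPILOT_SKU_ID
               for lic in user.get("assignedLicenses", [])):
            licensed.append(user["userPrincipalName"])
    url = page.get("@odata.nextLink")          # follow Graph paging

print(f"{len(licensed)} accounts hold the Copilot licence:")
for upn in sorted(licensed):
    print(" ", upn)
```

The SKU GUID itself comes from GET /v1.0/subscribedSkus; matching on the GUID rather than a display name keeps the audit stable across Microsoft's periodic product renames.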
Student access and age limits
Microsoft's own age floor for Copilot with a work or school account is 13, aligned with the Microsoft 365 Education terms. Students below that age cannot be given direct Copilot access under those terms; teachers can and do use Copilot on their own accounts to prepare materials for younger classes, but the children themselves do not interact with the service. For students aged 13 and above, the tenant admin can gate which Copilot features are available — chat only, in-app only, or full access — through the admin centre.
A concrete recommendation for schools new to Copilot: start with teacher-only access for a full term, even if your licence would technically permit student access. The adult use cases in Part 2 give you more than enough to work with for a first rollout, and delaying student access until the staff policy and exemplars are stable avoids the hardest version of the question landing in your first month. When you do enable student access, combine it with age-appropriate lessons on AI literacy — prompting, hallucinations, source criticism — rather than flipping the switch and hoping.
Reader responsibility. Readers outside Sweden should verify their own national regulations before relying on any specific rule in this section. AI in education is evolving quickly and country rules differ significantly. Sweden in particular lacks a unified national AI-in-school guidance document as of April 2026 — Skolverket and IMY have confirmed joint work in progress, without a published date. Countries including the UK, France, Germany, and the Netherlands have already issued national guides, each with its own interpretation of age floors and student-data handling. This guide is a starting point for local adaptation, not a substitute for the published rules in your own country.
Five policy decisions for your school
- Require signed-in work/school account for all Copilot use. Where your tenant controls allow, block the consumer Copilot domain via conditional access on school-managed devices. This is the single behavioural change that protects the CDP boundary — every other policy assumes it is in place.
- Document the Copilot paths in your DPIA. Three paths to name explicitly: Microsoft 365 Copilot (paid, in-app), Copilot Chat with the work account (free, CDP), and — if permitted at all — any consumer Copilot use by staff on personal devices for clearly adult-only tasks. Cite the Microsoft DPA and the EU Data Boundary commitment as mitigations, and reference Schrems II as an unresolved risk category that the school has weighed.
- Produce a one-page "what to paste, what to avoid" staff guideline. Three concrete accept examples and three concrete avoid examples, tailored to your school — not generic. Pin it in the shared staff wiki and reference it in onboarding. A new teacher in August should find it in under a minute.
- Phase the rollout: leadership and subject leads first, then broader staff, then students (if permitted). The tenant admin flip that enables Microsoft 365 Copilot is instant; organisational readiness is not. Budget at least a term between each phase so lessons from one wave actually shape the next.
- Termly review. Microsoft ships Copilot updates monthly. A 30-minute standing calendar entry each term for the digital lead — skim the release notes, update the staff guideline, flag anything that changes the DPIA — is enough to stop the policy drifting out of date. Guidelines that do not survive contact with next term's product update are not guidelines; they are wishes.