AI in Education · Practical Guide

Apple Intelligence for teachers and school leaders

A practical, GDPR-aware guide to the AI features built into your iPad, Mac, and iPhone — what stays on the device, what doesn't, and what that means for classroom work.

Updated April 2026

Part 1

Get started

What runs Apple Intelligence, what it can actually do today, and — the important bit — what Apple's "on-device" claim really means when you look closely.

Which devices run Apple Intelligence

As of April 2026, Apple Intelligence runs on a narrow slice of the Apple hardware you are likely to meet in a school. On iPad, you need an M1 chip or later — an iPad Pro from 2021 or an iPad Air from 2022 onwards — or the current iPad mini with the A17 Pro chip. On Mac, any machine with Apple silicon (M1 or later) will run it. On iPhone, only the iPhone 15 Pro, iPhone 15 Pro Max, and the full iPhone 16 and 17 lines are supported. The standard iPhone 15 is not.

The practical point for Swedish schools: the iPads most commonly found in 1:1 programmes — the ninth- and tenth-generation standard iPads, and the older iPad Air models with A-series chips — do not run Apple Intelligence. That is worth being honest about at the front of this guide. If your fleet is in that group, this material is useful for policy and planning, but not something you can hand to staff to try this afternoon.

Language support has expanded quarterly since launch. English came first; Swedish became available during 2025 alongside the other main European languages. Because Apple ships new languages and regional features on a rolling basis, verify the current state on your own device rather than trusting a document that is already a few months old. The simplest check: open Settings → Apple Intelligence & Siri. If that menu item appears at all, your device is compatible and you can see which languages are currently available to you.

Turning it on (iPad-first)

On a compatible iPad, the setup is short but the download is not. You toggle the feature on, and your iPad then pulls down a roughly 4 GB on-device model in the background. On decent school Wi-Fi this takes thirty to sixty minutes. The settings panel will read "Preparing" during that window; you can keep using the iPad normally while it works.

  1. Open Settings on your iPad.
  2. Tap Apple Intelligence & Siri.
  3. Toggle Apple Intelligence on and confirm.
  4. Leave the iPad connected to Wi-Fi and power while the model downloads.
  5. When "Preparing" disappears, Writing Tools, the new Siri, and Mail categorisation are live.

On Mac, the path is the same under System Settings → Apple Intelligence & Siri, and you need about 4 GB of free disk space. On iPhone, the path is identical to iPad with the same 4 GB footprint. You do not need to sign in to anything new: Apple Intelligence runs under the Apple Account you are already using on the device.

One honest caveat before you proceed. The feature appears in Settings on any compatible device, but some sub-features are gated by region or by age — ChatGPT integration, for example, requires the user to be 18 or over and sits behind a separate opt-in. The guide returns to these limits in Part 3, where they matter for school policy.

Private Cloud Compute — what it actually is

The distinction that matters

Apple talks about Apple Intelligence as an "on-device" product, and for most of what you will do in a classroom that description is accurate. But not for all of it. There are three distinct places your request can be processed, and knowing which one is in play for a given task is the difference between an informed user and one who is trusting a slogan.

1. On-device processing. The majority of everyday Apple Intelligence operations run entirely on the iPad, Mac, or iPhone in front of you. Most Writing Tools actions — rewrite, proofread, shorten — fall here, as do basic Siri requests, Mail categorisation, and Notification summaries. Nothing leaves the device. No request, no input, no output touches Apple's servers or anyone else's. For the kind of quick parent-email rewrite a teacher does ten times a week, this is almost always what happens.

2. Private Cloud Compute (PCC). When a request exceeds what the on-device model can handle — a long summary, a complex Writing Tools operation, Image Wand generation — the system reaches for a more capable model running on Apple-controlled servers. Apple's published architecture for PCC has several distinctive properties: the request is encrypted end-to-end from your device to a PCC node; the servers run a published, auditable software image; the server discards the request data after returning its response, a property Apple calls stateless computation; and Apple publishes the server binary so independent security researchers can verify that what runs in production matches what was described. The full technical whitepaper sits at security.apple.com.

3. ChatGPT integration. This is a separate, opt-in path. When a user enables it, specific requests can be forwarded to OpenAI's servers. Apple anonymises the request in transit, but OpenAI receives the actual content of what was sent. This is a fundamentally different privacy model from the first two tiers and it deserves its own treatment; Part 3 returns to it in detail.

Two honest qualifications are worth stating here, one in each direction. First, PCC really does have stronger architectural guarantees than a typical SaaS API call — the auditability story is real, and it is unusual in the industry. Second, "Apple cannot inspect your request" is nevertheless a trust assertion that depends on Apple's implementation matching what they have published. The audit story is what makes that trust somewhat verifiable rather than purely reputational, but it is not the same as the data never leaving the device.

A useful rule of thumb: a quick rewrite of a two-sentence parent email almost certainly stays on the iPad. Asking Apple Intelligence to summarise a ten-page PDF almost certainly invokes Private Cloud Compute. Anything that touches ChatGPT is clearly signalled in the interface and requires an explicit tap.

Part 2

Work smarter

Six features that change day-to-day teaching work, with practical examples from real classrooms and a clear note on where each one processes your data.

1. Writing Tools

Writing Tools is the built-in rewrite, proofread, and summarise surface that appears wherever you can edit text on iPadOS — Mail, Notes, Pages, Safari text boxes, the Classroom app's messaging panel. It is not a separate app to open; it is a menu option that appears when you select text.

Classroom examples

  • Rewriting assessment feedback: A terse "Needs more detail" becomes warmer in two seconds via the Friendly tone option. The student reads it and actually engages with the comment instead of skimming past it.
  • Parent email in three registers: Write the draft once, then generate Professional, Friendly, and Concise variants. Pick the tone that matches the history you already have with that particular parent.
  • Tightening a learning intention: Paste a paragraph from the syllabus, request a summary, and end up with one sentence students can reread between activities without losing the meaning.
  • Proofreading Swedish: Works on Swedish text and catches common agreement errors and clumsy constructions. Not a substitute for a language teacher's eye, but useful for your own drafts before they go out.
  • Expanding a bullet list: Turn a five-bullet outline into a coherent three-paragraph email for the morning briefing without rewriting from scratch.

Where it runs

Short operations — rewrite, proofread, a brief summary — run on-device. Longer summaries across multi-page documents invoke Private Cloud Compute. Nothing is sent to ChatGPT unless you explicitly choose the "Compose with ChatGPT" option in the menu.

How to start

  1. Select any text in any editable field on your iPad.
  2. Tap Writing Tools in the callout menu that appears above the selection.
  3. Choose an action — Proofread, Rewrite, Friendly, Professional, Concise, Summarise, or Key Points.
  4. First use will ask you to confirm Apple Intelligence is enabled; after that it is a two-tap operation.

On Mac: Writing Tools is available via right-click on any selected text, or from the Writing Tools icon that appears in the menu bar when text is selected in a supported app.

2. Siri 2.0

The refreshed Siri uses on-device intelligence to hold context across requests, understand natural phrasing, and act inside apps — opening specific documents, extracting information, and scheduling follow-ups without you having to switch apps yourself.

Classroom examples

  • Context-aware scheduling: "Move the Year 9 parent meeting to next Thursday and email the parents." Siri updates Calendar and drafts the Mail message in one step.
  • Cross-app retrieval: "Find the lesson plan I wrote about photosynthesis last term." Pulls the Notes document without you hunting through folders.
  • Quick definitions in class: Ask a technical question mid-lesson; the answer arrives without you leaving your lesson slides or breaking the flow for students.
  • Morning briefing: "What meetings do I have today and are there any unread parent emails?" Returns both in one response, so the first five minutes of the day become a triage rather than a scramble.
  • Confirm before destructive actions: When you ask Siri to send an email, the draft is shown first for approval. A sensible default for classroom use, where voice commands can easily be overheard by students.

Where it runs

The overwhelming majority of Siri 2.0 requests run on-device. Complex queries may invoke Private Cloud Compute. The ChatGPT handoff is only triggered on an explicit prompt ("Ask ChatGPT about…") and always shows a confirmation screen before sending.

How to start

  1. Hold the top button on your iPad, or say "Siri" if voice activation is enabled.
  2. Watch for the subtle light that wraps the screen edges — that visual cue indicates the updated Siri is listening.
  3. Speak naturally; you do not have to phrase requests as commands any more.
  4. Review the response on screen and confirm any action that sends an email, a message, or a calendar invite.

On Mac: Triggered via the menu bar icon or by pressing and holding the Globe key (the default shortcut). Same capabilities and the same on-device-first processing.

3. Mail categorisation and Priority

Apple Mail now sorts incoming messages into four categories — Primary, Transactions, Updates, and Promotions — and surfaces time-sensitive Primary messages at the top as Priority Mail. For a teacher with a flooded inbox after a field-trip announcement, the effect is immediate.

Classroom examples

  • Field-trip fallout: After announcing a field trip, eighty parent replies arrive overnight. The Primary inbox surfaces the three that ask actual questions; the rest — confirmations, no-shows, thanks — are grouped below.
  • Finding the email you need: A parent asks "Did you get my email?" during pickup. Priority Mail has typically pinned it at the top already, so the answer takes five seconds rather than five minutes.
  • Conference notifications: Registration confirmations and receipts from professional development land in Transactions, where you only see them when you go looking for them.
  • Newsletter triage: EdTech newsletters accumulate in Promotions, not in your main attention. You can still read them; they just no longer interrupt the working day.
  • Unreading made easy: Swipe a Primary message back to unread if it needs a reply later; it stays grouped under Primary, so the context of why you set it aside is preserved.

Where it runs

Entirely on-device. Categorisation uses the on-device model on your iPad; no email content is sent to Apple servers or to anyone else. This is one of the features with the cleanest privacy profile in the entire Apple Intelligence suite.

How to start

  1. Open the Mail app on your iPad.
  2. Tap the category name at the top of the inbox (it defaults to "Primary") to browse other categories.
  3. To turn categorisation off entirely, go to Settings → Mail → Categorisation and switch to the flat-list view.

4. Notification summaries

iPadOS now groups related notifications and shows a one-line summary of each group. A pile of twenty parent messages becomes one readable line that tells you what the group is about, so you can decide whether to open it now or later.

Classroom examples

  • Dawn catch-up: Open the iPad and see "5 messages from parents about tomorrow's trip" instead of five separate lines all demanding attention at once.
  • After-assembly sweep: A run of Teams messages from the department gets summarised as "Three updates about Year 10 assessment dates" — enough to know whether you need to read now or at lunch.
  • Weekend boundary: Monday-morning opening reveals "12 notifications, including 2 from the head teacher" — triage becomes obvious at a glance.
  • Group chat awareness: A long group thread about staff party logistics becomes a one-liner, so you do not have to scroll through forty messages to confirm you are not needed.
  • What it can't do: Summaries occasionally read an urgent message as routine, and Apple itself flags the feature as capable of errors. Do not rely on summaries for safeguarding-sensitive messages; open the messages themselves when the stakes are real.

Where it runs

Entirely on-device. No notification content is processed off the iPad. As with Mail categorisation, this is a feature where the privacy picture is straightforward.

How to start

  1. Summaries are on by default for most app categories once Apple Intelligence is enabled.
  2. To fine-tune, open Settings → Notifications → Summarise Previews.
  3. Toggle individual apps on or off depending on whether you want their notifications summarised or kept in their raw form.

On Mac: Available but less prominent; macOS notifications are themselves a lower-visibility surface. Most teachers will feel the difference on iPad rather than on the Mac.

5. Notes and Image Wand

The Notes app has gained an Image Wand that turns rough sketches or short descriptive text into polished illustrations — for handouts, whiteboard-ready diagrams, or worksheet graphics. Unlike the separate Image Playground app, it works directly inside the note, where your lesson material already lives.

Classroom examples

  • Whiteboard diagrams: A rough circle-and-labels sketch becomes a clean vector-style diagram you can drop straight into a Keynote slide without redrawing it.
  • Handout illustrations: Describe "a friendly robot reading a book" and Image Wand generates a classroom-appropriate illustration in seconds. Useful when you need a consistent visual style across a set of materials.
  • Worksheet headers: Generate a themed illustration for each week's worksheet pack without hunting through stock-image libraries or worrying about licences.
  • Visual anchors for a lesson: Quick, distinctive images make abstract concepts easier for students to reference during follow-up questions later in the unit.
  • Student-shared notes: When you share a Note with a colleague or a student, Image Wand-generated visuals travel with the document. No separate image files to manage.

Where it runs

Image Wand generation runs in Private Cloud Compute. Your sketch or description is transmitted encrypted to Apple's servers, processed, and returned; the request data is discarded after the response. See Part 3 for the full privacy picture, including what this means for school policy.

How to start

  1. Open Notes on your iPad and open a new or existing note.
  2. Tap the Apple Pencil circle icon, or the Image Wand button in the toolbar.
  3. Sketch a rough shape or type a short description of what you want.
  4. Tap Wand to generate; pick the version you like best from the results.

On Mac: Same Notes feature; trigger it via the Image Wand menu item. Mac generation feels slightly faster in practice but uses the same Private Cloud Compute path.

6. Visual Intelligence

Use the iPad camera — or, on newer iPhones, the Camera Control button — to point at real-world text, objects, or documents and get contextual actions. Translate, summarise, look up, identify. It is the feature that turns the camera into a research tool without you having to open a browser.

Classroom examples

  • Textbook page translations: Point at a paragraph in a Swedish geography textbook and get an English summary pitched for an EAL student in your class.
  • Lab equipment identification: An unlabelled piece of science equipment in a shared-resources cupboard — point, get the name, and a short description of what it does.
  • Museum-trip companion: Point at an exhibition panel during a school trip and get a summary in plain language pitched for Year 6 reading level.
  • Restaurant menu translation: Staff training day abroad — useful for personal use during travel, with the same on-device mechanics as the classroom cases.
  • What it deliberately won't do: Faces, student ID cards, and anything else that could constitute personal data are limited by design. Even so, do not point Visual Intelligence at anything that contains student names or images.

Where it runs

Basic recognition and translation happen on-device. Longer summarisation invokes Private Cloud Compute. If you explicitly tap "Ask ChatGPT" for a deeper explanation, that specific query is forwarded to OpenAI with a confirmation screen. Part 3 addresses the privacy implications of each path.

How to start

  1. On newer iPhones, hold the Camera Control button on the side of the device.
  2. On iPad, open the Camera app and point it at recognisable content.
  3. Look for the Visual Intelligence button that appears in the camera interface when recognisable text or an object is in frame.
  4. Tap the button and choose an action — Summarise, Translate, Identify, or Ask.

On Mac: No direct equivalent — Visual Intelligence is a camera-first feature. Mac users get similar text-recognition through Live Text when images are imported, but the full Visual Intelligence surface is iPad and iPhone only.

Part 3

The GDPR angle — in practice

Why on-device processing matters under Swedish and EU data protection law, where Apple's privacy story holds up, and the one place it breaks.

What stays on your device

A meaningful share of Apple Intelligence runs without anything leaving the iPad or the Mac in front of you. Writing Tools operations on short text — proofreading, tone rewrites, a brief summary of a few hundred words — are handled by the on-device model. Mail categorisation and notification summaries are entirely on-device. Most everyday Siri requests stay local. Visual Intelligence performs basic recognition — text detection, common-object identification — without a round-trip to Apple's servers.

This matters under GDPR because the regulation defines processing very broadly — Article 4(2) covers almost any operation performed on personal data. But if personal data never leaves the controller's own systems (in practice, the teacher's iPad), the question of whether a third-party processor is involved narrows considerably. That does not exempt the operation from GDPR scrutiny, but it reduces the surface area of a Data Protection Impact Assessment (DPIA) in a way cloud-based AI services cannot.

That said, "on-device" is not magic. The iPad is a controller-owned device, but it still backs up to iCloud unless the backup is disabled, still syncs across other devices signed into the same Apple Account, and still participates in the broader Apple ecosystem. On-device processing reduces exposure. It does not eliminate it, and a school's policy should treat it as one mitigation among several rather than as a clean answer.

When Private Cloud Compute is used

Some requests exceed the capacity of the on-device model. Rewriting multi-paragraph text, summarising a long document, generating an image with Image Wand, complex Visual Intelligence summarisation, and the more demanding Siri queries all invoke Private Cloud Compute (PCC). Apple's own documentation — published at security.apple.com alongside a technical whitepaper on the PCC architecture — describes the path the data takes once the device decides to hand off.

Technically, the request is encrypted end-to-end to Apple's PCC servers. Those servers run a specific software image that Apple has published for independent audit, so researchers can verify that the code running in production matches the code Apple claims to run. The computation is performed, the response is returned, and — according to Apple's stated architecture — the request data is discarded once the response is delivered. There are no persistent server-side logs of request content, and Apple engineers have no privileged back door into live requests.

Honestly framed: PCC is still a data transfer to a processor. The architectural guarantees are stronger than a typical SaaS API call — most SaaS APIs log requests and could be inspected by engineering staff — but from a GDPR perspective it is processing by Apple, and it is a transfer to Apple's infrastructure (Apple operates PCC regions in Europe, but verify current regional availability with your procurement contact before relying on it in a DPIA). It should be referenced in the school's DPIA and lined up alongside the existing Apple Data Processing Addendum. One pragmatic sentence for your policy document: PCC is probably the strongest privacy architecture any major cloud-AI provider has deployed at scale — and it still isn't the same as nothing leaving the device.

The ChatGPT exception

This is where the on-device promise breaks

The ChatGPT handoff is triggered when a user explicitly taps Ask ChatGPT — surfaced by Writing Tools, Visual Intelligence, or Siri when the request exceeds what Apple's own models will handle. iPadOS shows a confirmation dialog first, and nothing is forwarded without the user accepting it. That consent step is real, and it is a deliberate design choice by Apple. It is also the point where the privacy architecture of the rest of Apple Intelligence stops applying.

What actually happens once the user accepts: the specific text or image is sent to OpenAI. Apple anonymises the request at the network and account level — OpenAI does not see the Apple Account identity — but OpenAI receives the content itself. OpenAI's own logging, retention, and training policies then govern that content, not Apple's. The protections described in the PCC whitepaper do not carry over to this path.

The GDPR dimension is significant. OpenAI is a US-based processor, and transfers of EU personal data to US processors require specific legal mechanisms — Standard Contractual Clauses, adequacy considerations, and the ongoing Schrems II analysis of US surveillance law. An opt-in dialog on an iPad does not substitute for the school's own legal analysis. The practical recommendation is therefore blunt: disable the ChatGPT integration on school-issued iPads by default through Apple School Manager or your MDM, and train staff to recognise the "Ask ChatGPT" prompt and decline it for anything student-related. This is the single most important configuration decision for any school rolling out Apple Intelligence.
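For the digital lead who will act on that recommendation, the sketch below shows the shape of the MDM payload involved. It is a minimal Restrictions configuration profile, assuming the "allowExternalIntelligenceIntegrations" restriction key Apple documents for the ChatGPT integration on supervised devices (iPadOS 18.2 and later); the payload identifiers and UUID placeholders are illustrative, and both the key name and the minimum OS version are worth verifying against your MDM vendor's current documentation before you deploy anything.

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
      <key>PayloadContent</key>
      <array>
        <dict>
          <!-- Restrictions payload: blocks the ChatGPT handoff on supervised devices -->
          <key>PayloadType</key>
          <string>com.apple.applicationaccess</string>
          <key>PayloadIdentifier</key>
          <string>se.example.school.restrictions.appleintelligence</string>
          <key>PayloadUUID</key>
          <string>REPLACE-WITH-A-GENERATED-UUID</string>
          <key>PayloadVersion</key>
          <integer>1</integer>
          <!-- Assumed key name; verify with your MDM vendor (iPadOS 18.2+) -->
          <key>allowExternalIntelligenceIntegrations</key>
          <false/>
        </dict>
      </array>
      <key>PayloadDisplayName</key>
      <string>School restrictions: ChatGPT integration off</string>
      <key>PayloadIdentifier</key>
      <string>se.example.school.profile.appleintelligence</string>
      <key>PayloadType</key>
      <string>Configuration</string>
      <key>PayloadUUID</key>
      <string>REPLACE-WITH-A-GENERATED-UUID</string>
      <key>PayloadVersion</key>
      <integer>1</integer>
    </dict>
    </plist>

In practice most schools will set this through their MDM vendor's interface (Jamf, Mosyle, or Intune) rather than writing raw XML, but the underlying restriction key is the same either way.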

What this means for student data

The working rule is simple: de-identify student data before any Apple Intelligence interaction, even when you believe the operation will run on-device. The distinction worth drawing is between teacher-work data — planning documents without student names, generalised parent emails, your own meeting notes — and student data, which includes names, grades, behavioural notes, SEN plans, safeguarding concerns, and anything that could identify a child or their family. Teacher-work data is generally fine to process on-device. Student data should be de-identified or kept out of Apple Intelligence entirely.

Four reasons on-device processing alone does not make student data safe. First, iCloud Backup may include the document you just processed, and from that point on the data's journey is governed by iCloud's terms rather than the on-device model's architecture. Second, classroom iPads and cover-teacher devices are often shared, so material one teacher processed may be visible to another. Third, accidental dictation pickup during a lesson can send snippets of student discussion into a transcript you did not intend to create. Fourth, the on-device model does learn usage patterns locally — not a data breach, but something to be aware of when the same device moves between teachers or across a role change.

A plain working heuristic for staff: if the document would be sensitive to send to a cloud service, apply the same discipline to Apple Intelligence interactions — even when you are told they run on-device. The habit is more reliable than a feature-by-feature mental model.

Five policy decisions for your school

  1. Disable "Compose with ChatGPT" by default via MDM. On school-managed iPads and Macs, use Apple School Manager or your MDM provider to set the default for the ChatGPT Extension to off. Staff can be trained to enable it deliberately for specific personal-device use cases, but the default should never be on.
  2. Document Apple Intelligence in your school's DPIA. Most processing is on-device, but Private Cloud Compute constitutes a data transfer to Apple and should be referenced alongside your existing Apple Data Processing Addendum. Cite Apple's published PCC architecture — the security.apple.com documentation and the PCC whitepaper — as a risk-mitigation factor, not as a replacement for your own analysis.
  3. Produce a one-page staff guideline on what to paste. Teacher-work content is fine; student data should be de-identified or avoided. Include three worked examples of acceptable prompts and three of prompts to avoid, with names and specifics drawn from your own school context. Pin it to your shared staff wiki and reference it in onboarding.
  4. Make iCloud Backup settings an onboarding topic. Staff should understand that the iPad's iCloud Backup may include documents processed by Apple Intelligence. For personal devices that also handle school work, this is legitimately a matter of per-teacher judgement — but the default should be conscious, not accidental, and the school should offer a clear recommendation.
  5. Build in a termly review. Apple ships Apple Intelligence features monthly; EU feature availability shifts; ChatGPT integration mechanics may evolve. A thirty-minute calendar entry each term for the digital-lead to skim Apple's release notes and update the staff guideline is enough to stop the policy drifting out of date.

FAQ

Questions teachers ask

Does my iPad support Apple Intelligence?

Apple Intelligence requires an iPad with an M1 chip or later — that means iPad Pro (M1 and up), iPad Air (M1 and up), and recent iPad mini models with Apple silicon. Most school-fleet iPads running A-series chips do not qualify. The quick check: open Settings and look for Apple Intelligence & Siri — if the menu is there, your device supports it. If your school's iPads don't make the cut, rushing to upgrade is rarely the right call. Most teachers gain more, faster, from Claude or Gemini in a browser than from waiting on a hardware refresh. Hardware is the smallest part of an AI strategy.

Is Apple Intelligence available in Swedish?

Swedish localisation rolled out during 2025, but Apple ships languages in quarterly waves, so verify the current language matrix on Apple's support pages before planning a staff rollout. A useful nuance: Writing Tools and Mail categorisation work on Swedish text even before full Siri localisation, because the on-device language model handles Swedish grammar and vocabulary competently. Siri 2.0's fully localised voice responses depend on a separate release track. For classroom purposes, treat written-Swedish features as ready today and expect spoken-Swedish Siri to keep improving over successive iPadOS updates through 2026.

Should I turn on the ChatGPT integration?

For personal, adult-only use, it's a judgement call. For school-issued iPads with any student-facing use, disable it by default through your MDM. The reason is straightforward: the ChatGPT integration sends specific requests to OpenAI, a US-based processor whose retention and training policies are different from Apple's — and not covered by Apple's Data Processing Addendum. It isn't inherently dangerous for general writing tasks, but it is a separate legal analysis your school should have made before enabling it. Start with it off. Enable it consciously, not accidentally. Part 3 walks through the policy implications in more depth.

Can I use Apple Intelligence with student data?

The working rule is the same one that applies to any AI tool: de-identify first. Even on-device processing deserves the same discipline you'd apply to a cloud service when the content involves student names, grades, behavioural notes, SEN plans, or safeguarding concerns. The reasons are practical — iCloud Backup may include the document, shared or substitute-teacher access means material doesn't always stay with you, and accidental dictation pickup can capture things you never meant to record. Teacher-work data — your planning documents, generalised parent communications, your own meeting notes — is generally fine to process on-device. Part 3 covers this in more depth.

How does this compare to Claude or Gemini?

Different tool class. Claude and Gemini are general-purpose chatbots built for open-ended thinking work — drafting, analysis, research, longer conversations where you're reasoning through something with the model. Apple Intelligence is an ambient assistant built into the device: Writing Tools, Mail categorisation, Siri-driven actions, Visual Intelligence. Choose based on the task. Claude or Gemini for "help me think through this". Apple Intelligence for "do this small polishing or retrieval thing quickly without leaving my flow". They're complementary rather than competitors, and most teachers end up using both — the chatbot on a laptop for the hard thinking, Apple Intelligence on the iPad for the in-context nudges.

What about Apple Intelligence on my personal iPhone — is that relevant for school use?

Relevant for the adult-productivity side: parent emails drafted on your commute, meeting prep on the bus, the ten-minute tidy-up of tomorrow's lesson plan while waiting for coffee. Not relevant for any student-facing work. Your personal iPhone is generally not covered by your school's data processing agreements with Apple, so student data should not touch it — the legal basis for processing simply isn't there. Keep student data on school-issued devices with proper MDM configuration. Treat personal-iPhone Apple Intelligence as a personal productivity gain, not as an extension of your school's AI tooling.

Does Apple Intelligence train on my data?

Apple's public stance is that they do not train their models on user data from Apple Intelligence requests — whether those requests are processed on-device or via Private Cloud Compute. PCC requests are stateless and discarded once the response is returned. The ChatGPT integration is a different matter: OpenAI's own training and retention policies apply to that specific subset of opted-in requests, unless you are on a plan where training is explicitly disabled, and those policies evolve, so verify OpenAI's current terms if this matters to your DPIA. The short version: Apple = no training by default. ChatGPT = depends on opt-in and plan.

Is Apple Intelligence covered by my school's existing data processor agreement with Apple?

Verify with your DPO and procurement contact. Apple's published Data Processing Addendum for Apple School Manager and education customers generally covers Apple services used as part of normal device operation, and Apple Intelligence as an extension of iPadOS is usually within scope for on-device processing and Private Cloud Compute. The ChatGPT integration is a different matter entirely — it is not covered by Apple's DPA, because it's a separate relationship between the user and OpenAI, and it requires its own legal analysis. Your school's DPIA should explicitly address the ChatGPT path as a distinct risk and document the decision to disable it by default.

Can I turn off specific features while keeping Writing Tools?

Yes. Each feature has its own toggle, which matters for a staff rollout where you want the master switch on but the noisier or less familiar features off. Mail categorisation lives under Settings → Mail → Categorisation. Notification summaries live under Settings → Notifications → Summarise Previews. Writing Tools itself is tied to the master Apple Intelligence toggle, but the specific ChatGPT-adjacent behaviour — "Compose with ChatGPT" and similar — has its own setting under Apple Intelligence & Siri → ChatGPT Extension. A sensible default for a staff rollout: master switch on, Writing Tools on, Mail categorisation off, ChatGPT Extension off.
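For school-managed devices, that same sensible default can be pushed centrally rather than set by hand on each iPad. Below is a fragment of the same Restrictions payload sketched in Part 3, using per-feature keys Apple has shipped for Apple Intelligence; the key names and their minimum OS versions are assumptions worth verifying against your MDM vendor's documentation, and note that Mail categorisation itself appears to have no restriction key of its own and remains a per-user Mail setting.

    <!-- Fragment of the com.apple.applicationaccess (Restrictions) payload dict -->
    <key>allowWritingTools</key>
    <true/>   <!-- keep Writing Tools available to staff -->
    <key>allowMailSummary</key>
    <false/>  <!-- switch off Mail message summaries -->
    <key>allowExternalIntelligenceIntegrations</key>
    <false/>  <!-- ChatGPT Extension off, as recommended in Part 3 -->

Listing the allowed features explicitly alongside the blocked ones makes the profile self-documenting: a colleague reading it a year from now can see the policy at a glance.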

Where do I get help if something goes wrong?

Start with Apple's own documentation at support.apple.com/apple-intelligence, and for the technical privacy architecture behind Private Cloud Compute, the Apple Security Research site at security.apple.com is genuinely well written. For school-specific questions — policy decisions, DPIA implications, rolling out across a staff room, EU AI Act alignment — reach out through my website (link in the footer). I offer consulting for schools, primarily in Sweden. For MDM-specific configuration, your school's managed-device provider (Jamf, Mosyle, Intune, Apple School Manager itself) will have Apple Intelligence documentation and the relevant MDM profile options.

Take the guide with you

Download the portable PDF — same content, premium layout, perfect for offline reading or sharing with your staff.

Download the Quick Start (2 pages)


License

This work is licensed under CC BY-NC-SA 4.0. You may copy, share, and adapt this material for non-commercial purposes, provided you credit Johan Lindström and share any adaptations under the same license.