Copilot & Gemini in Schools: What GDPR Compliance Really Looks Like

By Nathan Cleary • 18 January 2026

AI tools like Microsoft Copilot and Google Gemini are no longer on the horizon — they’re already here.

Teachers are using them to plan lessons faster. Leaders are exploring how they might reduce admin pressure. And schools everywhere are asking the same question:

Are Copilot and Gemini actually GDPR compliant?

The honest answer is: they can be — but only if they’re set up and used properly.

Let’s take the noise out of it and look at what schools really need to know.

Why This Question Matters

Schools don’t have the luxury of “move fast and fix later”.

You’re responsible for:

  • Pupil data
  • Staff data
  • Safeguarding
  • Compliance

So when AI tools arrive promising speed and efficiency, it’s right to pause and ask whether they fit safely into your existing responsibilities.

This isn’t about stopping innovation.

It’s about using it confidently and responsibly.

The Reassuring Bit: The Platforms Themselves

When used through their education environments, both Copilot and Gemini are built with GDPR in mind.

Microsoft Copilot

Used within Microsoft 365 for Education, Copilot:

  • Operates inside your school’s tenant
  • Doesn’t use your data to train public AI models
  • Respects existing permissions and access controls
  • Aligns with Microsoft’s GDPR and UK data protection commitments

Google Gemini

Used through Google Workspace for Education, Gemini:

  • Keeps data within your school’s managed environment
  • Doesn’t train consumer AI models on education data
  • Works alongside Google’s existing security and privacy controls

So at a platform level, the foundations are solid.

The risk doesn’t usually come from the tools themselves — it comes from how they’re used.

Where GDPR Risks Can Creep In

Most GDPR issues around AI aren’t technical. They’re human.

Common problem areas include:

  • Staff pasting personal or sensitive information into prompts
  • AI tools being accessed outside approved school accounts
  • A lack of clarity around what is and isn’t appropriate use
  • Policies that haven’t kept pace with new technology

AI doesn’t remove accountability.
It shifts it.

And without guidance, people will fill the gaps themselves — often with good intentions, but inconsistent results.

What “Good” Looks Like in Schools

Schools that use AI safely and confidently tend to have a few things in common.

1. Clear expectations

Staff know:

  • What data should never be entered into AI tools
  • Which accounts and platforms are approved
  • When AI is helpful — and when it isn’t appropriate

2. Sensible policies (not scary ones)

This isn’t about banning tools or over-policing staff.
It’s about setting clear, practical boundaries that people can actually follow.

3. The right technical setup

That means:

  • Proper account management
  • Device controls
  • Access aligned to roles
  • Oversight without micromanagement

4. Ongoing conversations

AI isn’t a one-off decision.
The best schools review, adapt, and keep staff informed as tools evolve.

What This Means for Teachers

When AI is used well:

  • Planning takes less time
  • Admin pressure eases
  • Creativity has more room to breathe

When it’s unclear or risky:

  • Confidence drops
  • Staff worry about “doing the wrong thing”
  • Tools get avoided or misused

Good guidance doesn’t slow teachers down.
It gives them confidence to use AI properly.

What This Means for School Leaders

For leaders, AI governance is quickly becoming part of everyday responsibility — alongside safeguarding, data protection, and digital strategy.

Handled well, Copilot and Gemini can:

  • Support staff wellbeing
  • Reduce workload pressures
  • Align with GDPR and inspection expectations

Handled poorly, they can introduce uncertainty and risk.

The difference is rarely the technology.
It’s the setup, clarity, and support around it.

The Bottom Line

Copilot and Gemini can be GDPR compliant in schools.

But compliance doesn’t happen by accident.

It comes from:

  • The right configuration
  • Clear expectations
  • Confident, informed staff
  • Ongoing support

Get those pieces right, and AI becomes what it should be — a tool that frees up time, not a source of worry. And if you’re not sure where your school currently sits, that’s okay. Most schools are figuring this out right now.

You don’t have to do it alone.
