A teacher comes back from a conference raving about an AI tool. A governor sends a link to something they read about in the Times Educational Supplement. A sales email lands in the headteacher's inbox promising to transform staff workload. The number of AI tools being pitched at schools is growing fast — and not all of them deserve a yes. Here are five questions worth asking before anything gets adopted.

Why this matters more in schools than in most organisations

Most organisations that adopt a new software tool are primarily managing a business risk. Schools are managing something more significant: the safety and wellbeing of children, and the trust of the families who have placed those children in the school's care. That doesn't mean being paralysed by caution; it means being thoughtful about what you adopt and why.

The good news is that a small number of questions will filter out most of the tools that aren't appropriate. You don't need to be a data protection expert or a technology specialist to ask them. You just need to ask them consistently, for every tool, every time.

Question 1: Is it GDPR compliant — and can they prove it?

Any AI tool that will be used in a school context needs to comply with the UK GDPR. Not "we take privacy seriously" compliant, but demonstrably compliant, with a Data Processing Agreement (DPA) available on request. If a tool doesn't have a DPA, or if the company can't tell you clearly where your data is stored and how it's handled, that's a serious red flag. The ICO's guidance on AI and data protection, published on its website, is a useful reference point for what good practice looks like.

Question 2: Does it use your data to train its AI models?

This is a separate question from GDPR compliance, and it's an important one. Some AI tools are GDPR compliant but still use your inputs — the prompts you type, the documents you upload — to improve their AI models. For a school, that means the content your staff produce could be feeding into a commercial AI training process. Tools built specifically for education, like Teachmate, explicitly don't do this. Always check the privacy policy for the phrase "we do not train our models on your data" — or equivalent. If it's not clearly stated, ask directly.

Question 3: Is there an education-specific version or accreditation?

There's a significant difference between a general AI tool that a teacher happens to be using for school work and a tool that has been specifically designed, reviewed and accredited for educational use. The DfE's AI safety standards provide a framework here: tools like Teachmate that meet those standards have been through a level of scrutiny that general tools haven't. The DfE's guidance on generative AI in education is worth reading in full if you haven't already. Cyber Essentials certification is another meaningful signal: it shows the company behind the tool has met a government-backed baseline for cyber security.

Question 4: What happens if something goes wrong?

This is the question most people forget to ask until they need the answer. If a member of staff accidentally enters pupil data into the tool, what happens? Is there a way to delete it? Who do you contact? Is there a support team — a real one, with a phone number or a response SLA — or just a help centre full of FAQs? A tool that can't answer these questions clearly is a tool that will leave you exposed if something goes wrong. And in schools, where the data involved is often sensitive, something will eventually go wrong. Having a clear incident response path before you need it is not pessimism — it's good governance.

Question 5: Will staff actually use it — and do they have the support to do so safely?

The most GDPR-compliant, education-accredited, Cyber Essentials-certified tool in the world is useless if staff don't use it, or worse, dangerous if they use it without understanding the rules. Before adopting any AI tool, it's worth asking honestly: have we thought about how we'll introduce this to staff? Do people know what they can and can't put into it? Is there a clear safe-use guide they can refer to? Adoption without training is where most school AI stories go wrong. The tool isn't the hard part; the culture around it is.

A framework, not a barrier

These five questions aren't designed to make AI adoption harder — they're designed to make it easier to say yes with confidence. A tool that answers all five clearly is a tool you can adopt without anxiety, communicate to parents honestly, and describe to an Ofsted inspector without hesitation.

If you're working through this for the first time in your school, it's also worth having an AI policy in place before any tools are adopted — so that the answers to these questions are embedded in how your school operates, not just in the headteacher's head.

The tool that passes all five

For what it's worth, the tools we recommend to schools all pass this framework: Teachmate, Microsoft Copilot via school accounts, and Google Gemini via a school Google Workspace for Education account. They're GDPR compliant, they don't train on your data, they have education-specific versions, they have clear support and incident processes, and they're mainstream enough that staff training is straightforward.

That's not to say other tools aren't worth exploring. But starting with a shortlist that already passes the five questions is a much better position than starting from scratch with every new thing that lands in your inbox.

Want help working through this for your school?

We help primary schools adopt AI safely — from choosing the right tools to getting the policies, training and guidelines in place so everyone can use AI with confidence.

Get in touch