Not every school using AI is using it responsibly. Here are the five clear signs of a school that's got it right — and exactly what to ask if you're not sure.
Why this question matters more than it might seem
Most parents assume their school is doing the right thing. In most cases, they're right. But "using AI" covers a wide spectrum — from a headteacher with a carefully considered policy, trained staff and compliant tools, to a school where a few members of staff are quietly using personal ChatGPT accounts with no oversight and no data protection safeguards in place.
The difference matters. And as a parent, you have every right to know which end of that spectrum your school is closer to.
The five signs of a school using AI responsibly
1. They have an AI policy — and it's published
A school that is using AI responsibly will have a clear, governor-approved AI policy that is available on their website. It doesn't need to be long or complex. It needs to explain which tools are used, what data protection rules are in place, what staff can and can't use AI for, and what the school's position is on pupil-facing AI use.
If you can't find an AI policy on your school's website, that's the first thing worth raising with the headteacher.
2. Staff have been trained
A school using AI responsibly ensures that all staff — not just the tech enthusiasts — understand the rules before using any AI tool. That means knowing which tools are approved, understanding that pupil data must never be entered into any AI system, and knowing what to do if something goes wrong.
Training doesn't need to be extensive. But it needs to have happened, and there should be a record of it.
3. Pupil data is protected — clearly and absolutely
The most important data protection rule in AI use is the simplest: no pupil names, no personal details, no SEND information, no medical or welfare data enters any AI tool. This should be an absolute rule with no exceptions, and staff should be able to state it confidently.
Schools using only approved, education-specific tools — such as Teachmate, Microsoft Copilot via school accounts, or Google Gemini via school Workspace accounts — have much stronger data protection than those using personal consumer accounts.
4. They're open with parents
A school with nothing to hide about its AI use will communicate openly about it. That means a letter or email to parents when they adopt AI, a page on their website explaining their approach, and a willingness to answer questions honestly. If a school gets defensive or evasive when parents ask about AI, that's a red flag.
5. There's someone accountable
Responsible AI use in a school has a clear owner — usually the headteacher, supported by the school's Data Protection Officer. If there's been no thought given to who is responsible for AI policy and practice, that's a sign the governance hasn't been properly considered.
The questions to ask directly
If you want to assess your school's AI approach quickly, these five questions will tell you most of what you need to know:
- "Do you have an AI policy? Can I see it?"
- "Which AI tools do staff use, and are they GDPR-compliant?"
- "How do you ensure pupil personal data isn't entered into AI tools?"
- "Have all staff received training on safe AI use?"
- "Are any pupils using AI tools directly, and if so, how are parents informed?"
A school with good answers to all five is in good shape. A school that struggles with any of them has work to do — and knowing that is the first step towards getting it done.
The key signal: Responsible AI use isn't about having the most sophisticated tools or the biggest investment. It's about thoughtfulness, transparency, and putting safeguards in place before they're needed rather than after something goes wrong. Schools that do this well tend to be the same ones that communicate well with parents generally.
AskColin can review your school's AI approach
If you're a headteacher or school leader reading this and you're not sure where your school sits on this spectrum, we can help. AskColin offers a free initial conversation to review your current AI approach — what's working, what's missing, and what you'd need to do to get fully compliant and confident.
That includes reviewing whether you have the right policies in place, whether staff are trained appropriately, and whether the tools being used carry the right data protection credentials. We'll tell you honestly what you have, what you're missing, and how to close the gap — with no obligation to take it any further.
If you're a parent who wishes your school would do this, you're very welcome to share this page with your headteacher. The best schools actively want to know how they're doing — and an honest external review is one of the most useful things a school can do before AI becomes a bigger part of how they operate.
Get in touch for a free AI review conversation →
Know a school that could benefit?
If you're a parent, share this with your school — let them know about AskColin. If you're from a school and want to find out more, we'd love to hear from you.
Drop us a note → See what we offer
Is your school ready for AI?
AskColin helps UK primary schools adopt AI safely — with the training, tools and compliance documents to do it properly.
Get in touch — it starts with a free chat