Seventy-six per cent of teachers report having received no AI training at all. Not insufficient training. No training. Three-quarters of the profession are being asked to prepare students for an AI-shaped world with no formal preparation of their own.
This is not a failing of teachers. It is a failing of the system that supports them. The curriculum has not caught up. The CPD market has not caught up. And the expectation that teachers should figure this out independently, alongside everything else they are carrying, is unreasonable.
But the gap exists. And students are not waiting for it to close.
Why the gap is wider than it looks
The 76 per cent figure captures formal training. The real picture is more nuanced.
Some teachers have taught themselves. They have experimented with ChatGPT, watched YouTube videos, read articles. They have a working understanding of generative AI and some confidence in using it. What most of them do not have is a structured understanding of AI literacy as a teaching domain. Knowing how to use AI is not the same as knowing how to teach students to think critically about it.
Other teachers have avoided AI entirely. They are not opposed to it. They are overwhelmed. They already have a full timetable, marking, pastoral duties, exam preparation, and parental communications. Adding "learn about AI" to the list feels like one more demand with no time allocated to meet it.
A smaller group actively distrusts AI in education. They have seen students use it to cheat. They have seen colleagues produce AI-generated reports that sound polished but say nothing. They associate AI with shortcuts, not with critical thinking. They are not wrong about the risks. But avoiding AI literacy because of those risks leaves students more vulnerable, not less.
All three groups need different things. The self-taught group needs structure and confidence that they are teaching the right concepts. The overwhelmed group needs something they can use immediately with minimal preparation. The sceptical group needs to see that AI literacy is about thinking, not technology worship.
What exists right now
The DfE published free AI training modules for all school staff in June 2025, developed by the Chiltern Learning Trust and the Chartered College of Teaching. Four modules plus a leadership toolkit.
Module 1 covers understanding generative AI. Module 2 covers interacting with it effectively. Module 3 covers safe use, and the DfE considers this essential for every staff member. Module 4 covers use cases across different roles. Each module takes about 30 minutes. The Chartered College of Teaching offers a free certified assessment alongside them.
These are good. They are free. They exist. And most teachers have not completed them.
The problem is not availability. It is the same problem that affects all CPD: time. A 30-minute module sounds manageable until you try to find 30 uninterrupted minutes in a teacher's week. Schools that have embedded these modules into INSET days or directed time have seen take-up. Schools that have emailed the link and hoped for the best have not.
For school leaders reading this, the single highest-impact action is to allocate one INSET session to Module 3 (safe use) and run it as a whole-staff exercise. Not optional. Not "when you get a chance." Protected time, with discussion afterwards. This costs nothing except 30 minutes of directed time, and it addresses the safeguarding dimension that governors and inspectors will increasingly ask about.
What CPD cannot do
No amount of training will make a Geography teacher feel like an AI expert. That is not the goal. The goal is to make a Geography teacher feel confident enough to run a ten-minute AI literacy activity within a Geography lesson.
The difference matters. We are not asking teachers to become AI specialists. We are asking them to apply the critical thinking skills they already teach to a new context. A Geography teacher who teaches students to evaluate data sources already has the intellectual framework for teaching students to evaluate AI-generated data. What they need is the bridge between their existing expertise and the AI literacy concept.
This is the design principle behind AILitKit. Every guide includes a Preparation section called "Understanding the AI Concepts," and each concept comes with two elements.

"What you already know" connects the AI concept to something the teacher already does in their subject. It is one sentence designed to make the teacher think: I am closer to understanding this than I thought.

"Go deeper" is an expandable section with 3 to 5 sentences of professional development, including one surprising fact or example. A teacher who reads the Go Deeper section has learned something genuinely interesting in under two minutes. A teacher who does not has still covered enough to run the activity confidently.
The guide does the bridging work. The teacher brings the subject expertise. Between them, the student gets an AI literacy activity that feels native to the lesson, not bolted on.
The confidence ladder
If you are a teacher reading this and feeling behind, here is a sequence that works.
First, complete DfE Module 3 on safe use. Thirty minutes. This gives you the safeguarding baseline every staff member needs. You can find the modules at gov.uk under "Using AI in education settings support materials."
Second, try one AILitKit Lesson guide for a lesson you are teaching this term. Read the Preparation section. Pick one activity. Use it. See what happens. You will discover that AI literacy is mostly about asking good questions, which is something you already do every day.
Third, talk to a colleague about what happened. The best CPD on AI literacy right now is not a course. It is a conversation between two teachers about what worked and what did not, after they each tried one activity.
Fourth, if you want to go further, generate a Topic guide for a full unit. This maps every lesson in the scheme and shows you where AI literacy connects naturally and where it honestly does not. It includes a department meeting agenda so you can share what you have found with your team in 15 minutes.
None of these steps requires you to become an expert. All of them require you to start.
What leadership can do
For school leaders, the confidence gap is a structural problem with structural solutions.
Allocate time. One INSET slot for DfE Module 3 as a whole-staff exercise. One department meeting per term for sharing AI literacy practice. Time is the single biggest barrier teachers cite, and it is the one leaders can address directly.
Normalise not knowing. The fastest way to close the confidence gap is for a senior leader to say publicly: I am learning about this too. I tried something in my lesson and it did not go perfectly. Here is what I learned. Permission to be imperfect is more powerful than a training course.
Start small and make it visible. One teacher trying one activity and reporting back at a briefing has more impact than a policy document nobody reads. Make the early adopters visible. Let them describe what happened in a real classroom with real students.
Use the Whole Curriculum guide to map the picture. A Whole Curriculum guide audits your existing provision, identifies quick wins, and produces a governor briefing, an implementation timeline, and evidence statements for your self-evaluation form. It turns "we should probably do something about AI literacy" into "here is what we are doing, here is what we have covered, and here is what we recommend next."
The confidence gap is real. But it is smaller than it feels. Every teacher who has helped a student evaluate a source, question a statistic, or think critically about a persuasive text has already practised the skills that AI literacy requires. The context is new. The thinking is not.
Give AILitKit one lesson. Get a guide with activities, a preparation section that builds your confidence, discussion questions, and safeguarding notes. No AI expertise required. Try it free at ailitkit.com.