At some point this year, a governor is going to ask your school what it is doing about AI. It might be phrased as a question about safeguarding. It might come up during a curriculum committee meeting. It might be prompted by a news story about deepfakes or a parent complaint about students using ChatGPT for homework.
When that question arrives, you need an answer that is honest, specific, and proportionate to what your school has actually done. Not a slide deck full of aspirations. Not a vague commitment to "explore AI across the curriculum." Something concrete.
This post describes what governors need to hear, what they do not need to hear, and how to structure a report that serves its purpose without overstating or understating your provision.
What governors actually want to know
We tested AI literacy governor reports against a panel of fictional governors representing the range you find on a real governing body: an accountant who wants numbers and risk, a parent governor who works in compliance, a business owner who wants targets and pace, and a former headteacher who cares about the children.
Ten questions surfaced consistently.
What are we actually doing? Not what we plan to do. What has happened in classrooms this term. Which students were involved. What activities they did. A governor who reads "students explored AI concepts" learns nothing. A governor who reads "Year 8 students debated whether an AI diagnostic tool should override a human doctor's judgement" can picture the lesson.
How many students does this reach? Scale matters. One guide reaching 30 students in one subject is an honest start. It is not "embedded provision." Governors need the numbers to understand proportion.
Is it any good? Are activities well designed? Do teachers feel confident delivering them? Have students responded well? If you do not know, say so, and explain how you plan to find out.
Are we meeting standards? Which frameworks are you aligning to? How many of their domains have you introduced? Governors do not need to know what each domain means. They need to know you are tracking progress against recognised benchmarks. State the fraction: "14 of 28 domains introduced across four subjects." That is a number a governor can track term to term.
Are the children safe? This is non-negotiable. The safeguarding line must be in the first section of the report, not buried later. Be specific. "All activities are unplugged and require no AI tools" is clear. "Students used Google Workspace AI tools via school-managed accounts with staff supervision" is clear. "Activities use a mix of approaches" is not clear enough.
What about students with additional needs? One sentence confirming that AI literacy activities include adaptations for SEND students (or Students of Determination in UAE schools). If the activities do not, flag the gap and say what you plan to do about it.
What have we not done yet? Name the gaps honestly. A school with provision in three subjects has gaps in every other department. Say which ones. Say which you recommend addressing next and why.
What should we do next? Direction of travel. Specific, actionable. "Extend to KS4 Science and English next term" is actionable. "Continue to develop our provision" is not.
Is this consistent? Across departments. Across key stages. Across schools in a trust. If one department has done extensive work and three have done nothing, say so.
Can I track this? A governor needs a number they can compare term to term. "Key metrics this term: 5 guides across 4 subjects and 2 key stages, with 14 of 28 framework domains at introduced level or above." Put this at the end of the overview, every time, in the same format.
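Because the key metrics line uses a fixed format, it can be produced mechanically from coverage data rather than drafted fresh each term. The sketch below is a hypothetical illustration, not AILitKit's actual implementation; the data shape and field names are assumptions.

```python
def key_metrics_line(guides, domains_introduced, total_domains=28):
    """Build the termly tracking line from coverage data.

    guides: list of dicts, each with 'subject' and 'key_stage' keys
    (hypothetical shape, for illustration only).
    Keeping the format identical every term lets a governor compare
    the numbers at a glance.
    """
    subjects = {g["subject"] for g in guides}
    key_stages = {g["key_stage"] for g in guides}
    return (
        f"Key metrics this term: {len(guides)} guides across "
        f"{len(subjects)} subjects and {len(key_stages)} key stages, "
        f"with {domains_introduced} of {total_domains} framework domains "
        f"at introduced level or above."
    )
```

Regenerating the line from data each term, instead of editing last term's sentence, is what keeps the format stable enough to track.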
What governors do not need
Framework jargon without explanation. "OECD AILit Domain 2: Create with AI" means nothing to a governor who has never seen the framework. If you name a domain, add a plain-English gloss: "Create with AI (using AI tools to generate and refine work while maintaining control of quality)."
Every domain listed individually. If you track 28 domains, do not list all 28. Group them by theme: "Coverage is strongest in the ethical and critical thinking domains. Technical understanding of how AI systems work remains a gap." Governors need the pattern, not the list.
Overstated coverage. If you have produced one guide, do not claim you have "addressed" 11 domains. One guide introduces concepts. It does not embed them. Use honest depth language: Introduced (students encountered the concept), Practised (students engaged through a structured activity), Well Established (concept appears across multiple subjects and key stages with varied activities).
Commitments the school has not made. The report advises. It does not promise. "The school should consider extending to KS4 Science" is appropriate. "We will deliver 10 additional guides by Easter" is a commitment the governing body will hold you to.
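The depth language above (Introduced, Practised, Well Established) stays honest more easily if it is derived from coverage data rather than judgement calls. A minimal sketch of one such rule, assuming a simple record of where each domain appeared; the thresholds are illustrative assumptions, not an official rubric.

```python
def depth_label(appearances):
    """Assign a depth label to one framework domain.

    appearances: list of (subject, key_stage) pairs recording where the
    domain featured in a guide's activities (hypothetical data shape).
    Thresholds below are assumptions chosen to match the definitions
    in the text, not part of any published framework.
    """
    if not appearances:
        return "Not yet covered"
    subjects = {s for s, _ in appearances}
    key_stages = {ks for _, ks in appearances}
    # Well Established: the concept recurs across multiple subjects
    # and key stages.
    if len(subjects) >= 2 and len(key_stages) >= 2:
        return "Well Established"
    # Practised: students engaged more than once through structured activity.
    if len(appearances) >= 2:
        return "Practised"
    # Introduced: students encountered the concept once.
    return "Introduced"
```

The point is not this particular rule; it is that a domain's label should be reproducible from evidence, so one guide can never quietly become "addressed."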
How to structure it
Six sections work for any school at any stage.
Overview. Facts: guide count, subjects, key stages, approximate student reach. Safeguarding line: what tools were used and how they are governed. SEND line: are activities accessible? Key metrics line at the end for termly tracking.
What students experienced. Describe specific activities in plain language. Name activity types. Mention year groups. A governor should be able to read this section and picture what happened in a classroom.
Framework alignment. One sentence explaining why frameworks matter. State domain coverage as a fraction with depth qualifiers. Group by theme. Add plain-English glosses for any framework term.
What has been covered so far. Describe patterns. Cross-curricular connections. Strengths. Where multiple subjects reinforce the same AI concepts, note it. That overlap is evidence of coherent provision, not duplication.
Gaps and direction of travel. Name the gaps. Pair every gap with a recommendation. Scale the ambition to the school's stage. Early-stage schools should target one or two new subjects. Established schools should target quality audits and vocabulary consistency.
Key stage coverage. Provision per key stage with guide counts. Identify the thinnest key stage and recommend how to strengthen it.
Tone: proportionate to reality
The tone of the report must match what the school has actually done.
A school with one guide is at the beginning. The report should be honest about that without being apologetic. "The school has produced one AI literacy guide. This represents the very beginning of our work." Frame it as a deliberate starting point. Recommend where to go next.
A school with seven guides across three subjects is building something. Acknowledge the patterns forming. Note where departments are developing complementary provision. Recommend filling specific gaps.
A school with twenty guides across six subjects and three key stages has developing provision. Highlight cross-curricular connections. Recommend quality audits and vocabulary consistency across departments. The report should feel confident and self-critical in equal measure.
A school with forty guides across a trust has established provision. Focus on consistency across schools, formal progression mapping, and the shift from breadth to depth.
Never use the same framing for one guide as for forty. The report must feel proportionate.
What AILitKit generates
AILitKit generates the governor report from your school's coverage data. It reads how many guides you have produced, which subjects and key stages they cover, which framework domains have been introduced, practised, or well established, and what activities students experienced.
The report adapts its tone to your school's stage. It includes the safeguarding line, the SEND line, the framework alignment with plain-English glosses, the gaps with recommendations, and the key metrics line for tracking. For UK schools, it references Ofsted and DfE guidance. For Dubai schools, it references KHDA and the UAE National AI Strategy. For Abu Dhabi schools, it references ADEK. The regional context is built in.
You can copy the report, review it, and share it with your governing body. The report always includes a note at the bottom: "This report was generated by AI based on your coverage data. Review before sharing." Because a governor report about AI literacy that has not been reviewed by a human would be ironic.
The first report is the hardest
Your first governor report will be thin. It will show one or two guides, a handful of framework domains, and a long list of gaps. That is fine. The purpose of the first report is not to show that everything is in place. It is to show that you have started, you are tracking progress, and you have a direction.
Governors respect honesty and trajectory more than completeness. A school that says "we have done one guide, here is what students experienced, here is what we plan next" is in a stronger position than a school that says nothing and waits for someone to ask.
Start the report now. Even if the content is modest. The second report will show movement. The third will show a pattern. That is how governing bodies track progress.
AILitKit generates governor reports from your coverage data. Adapted to your school's stage, with framework alignment, safeguarding, and recommendations. Review, share with your board, and track progress term to term. Try it free at ailitkit.com