Two things happened in the European Union in the past year that change how schools think about AI literacy. DigComp updated to version 3.0 in November 2025, integrating AI across all 21 digital competences for the first time. And the EU AI Act moved into its enforcement phase: Article 4 on AI literacy and Article 5 on prohibited practices have applied since February 2025, with most remaining provisions applying from August 2026.
These are not distant policy developments. They create immediate obligations for schools in EU member states, and provide relevant context for international schools anywhere that follow EU-influenced curricula or serve families from EU countries.
What changed in DigComp 3.0
DigComp is the European Digital Competence Framework. Published by the European Commission's Joint Research Centre, it defines what digital competence means for citizens. Schools across Europe use it to structure digital literacy teaching. Many national curricula reference it directly.
Version 2.2 mentioned AI in some competences. Version 3.0, published in November 2025, integrates AI across all 21 competences in all five competence areas. This is not a separate AI section bolted onto the framework. It is AI woven through the entire structure.
The most visible change is that Competence Area 1 has been renamed from "Information and data literacy" to "Information search, evaluation and management." The shift in language reflects a world where searching for information increasingly means interacting with AI systems, not just typing keywords into a search engine.
For schools, the practical implication is that any lesson teaching digital competence now implicitly includes AI competence. A lesson about evaluating online sources is also a lesson about evaluating AI-generated content. A lesson about creating digital content is also a lesson about understanding what happens when AI assists in that creation.
The EU AI Act: what matters for schools
The EU AI Act is complex legislation. Most of it is about regulating companies that build and deploy AI systems. But several provisions are directly relevant to schools.
Article 4: AI Literacy Obligation. Every organisation in the EU that provides or uses AI systems must ensure its staff have sufficient AI literacy. This applies to schools. If your school uses AI tools for administration, marking support, or student-facing activities, you have a legal obligation to ensure your staff understand those systems well enough to use them appropriately. This has been in force since February 2025.
There is no specific fine for breaching Article 4 alone. But regulators will consider AI literacy compliance when investigating other breaches. The practical message is clear: schools deploying AI tools need staff who understand them.
Article 5: Prohibited Practices. Some AI uses are banned entirely. The one most relevant to schools is Article 5(1)(f): AI systems that infer emotions from biometric data in education are prohibited, except for medical or safety reasons. If your school uses any system that monitors student facial expressions, analyses voice tone for engagement, or tracks attention through a webcam, using that system in the EU is now illegal.
This provision is in force. It is not guidance. It is law. The maximum fine for prohibited practices is up to 35 million euros or 7 per cent of global annual turnover, whichever is higher.
Text-based sentiment analysis of written content is generally not considered biometric emotion recognition and is not caught by this prohibition. The distinction is between systems that process biometric data (face, voice, physiological signals) and systems that process text.
Article 50: Transparency Obligations. From August 2026, AI systems that interact directly with people must identify themselves as AI. AI-generated content must be machine-detectable as artificial. Deepfakes must be disclosed. If schools use AI chatbots for student support, those chatbots must identify themselves as AI, not present as human. If students create AI-generated content as part of classwork, they should understand the transparency obligations that apply to that content in the real world.
Annex III: High-Risk AI in Education. The Act classifies certain AI systems used in education as high-risk: systems that determine school admissions, evaluate learning outcomes, assess appropriate education levels, or monitor student behaviour during tests. If a school uses an AI system for any of these purposes, that system must meet stringent requirements including risk management, data governance, human oversight, and accuracy standards. Students and parents have a right to an explanation of how the AI was involved in any decision that significantly affects them.
What this means for curriculum planning
For EU schools, the combination of DigComp 3.0 and the EU AI Act creates a strong case for embedding AI literacy across the curriculum, not just in Computing.
DigComp 3.0 provides the competence framework. The EU AI Act provides the legal context. Together they say: students need to understand AI because it is now integrated into every aspect of digital life, and because the law requires organisations (including schools) to ensure AI literacy.
A school that can demonstrate structured AI literacy provision is meeting both the educational expectations of DigComp 3.0 and the legal expectations of Article 4. A school that cannot demonstrate this is vulnerable on both fronts.
For subjects like Business Studies, Law, and Citizenship, the EU AI Act itself becomes curriculum content. Understanding how governments regulate AI is part of understanding how modern governance works. AILitKit's EU AI Act tag mapping covers 24 specific provisions across 6 areas of the Act, from the roles in the AI value chain (who is a provider, who is a deployer, who is an affected person) to the eight domains where AI is classified as high-risk.
What non-EU schools should know
If your school is outside the EU, the EU AI Act does not directly apply to you. But it matters for three reasons.
First, many international schools serve families who will live and work in the EU. Students who understand how AI is regulated in the world's largest single regulatory market have a competitive advantage.
Second, the EU AI Act is influencing regulation worldwide. The UK's approach to AI governance references EU thinking. The UAE's emphasis on AI ethics in its mandatory curriculum aligns with EU values. The principles in the Act are becoming global norms, even where the law does not apply.
Third, if your school uses AI tools built by companies that operate in the EU, those tools are likely designed to comply with the EU AI Act. The transparency features, content labelling, and safety guardrails you see in tools like ChatGPT, Gemini, and Copilot exist partly because of EU regulation. Understanding the Act helps you understand the tools.
The timeline schools should track
Already in force (since February 2025): Article 4, the AI literacy obligation. Your school should already be ensuring staff understand the AI tools they use. Article 5, prohibited practices. Emotion recognition AI in schools is already banned.
August 2026: Article 50, transparency obligations. AI chatbots must identify themselves. AI-generated content must be machine-detectable. Deepfakes must be disclosed. Also from this date, Annex III high-risk requirements apply: AI systems used for admissions, grading, and student monitoring must meet full compliance requirements.
Ongoing: DigComp 3.0 is a framework, not a law. There is no compliance deadline. But national curricula that reference DigComp will update to reflect version 3.0, and schools will be expected to align.
Where AILitKit fits
AILitKit tracks DigComp 3.0 and the EU AI Act as optional framework toggles. For EU schools, both are auto-enabled. For non-EU schools, they are available in settings.
Activities are tagged against DigComp 3.0 competence areas and EU AI Act provisions where relevant. A lesson about evaluating AI-generated text tags against DigComp Area 1 (Information search, evaluation and management) and against the EU AI Act's transparency obligations. A discussion about whether AI should make school admissions decisions tags against Annex III (High-Risk AI in Education) and OECD AILit's Design AI domain.
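The tagging model described above can be pictured as a simple mapping from activities to framework provisions. The sketch below is purely illustrative: the class names, tag labels, and structure are hypothetical and do not reflect AILitKit's actual schema.

```python
# Hypothetical sketch of activity-to-framework tagging.
# All names and tag strings here are illustrative assumptions,
# not AILitKit's real data model.

from dataclasses import dataclass, field


@dataclass
class Activity:
    title: str
    # framework name -> list of provisions or competence areas tagged
    tags: dict = field(default_factory=dict)


activities = [
    Activity(
        "Evaluating AI-generated text",
        {
            "DigComp 3.0": ["Area 1: Information search, evaluation and management"],
            "EU AI Act": ["Article 50: Transparency obligations"],
        },
    ),
    Activity(
        "Should AI decide school admissions?",
        {
            "EU AI Act": ["Annex III: High-risk AI in education"],
            "OECD AILit": ["Design AI"],
        },
    ),
]


def coverage(framework: str) -> list:
    """Return the titles of activities tagged against a given framework."""
    return [a.title for a in activities if framework in a.tags]


print(coverage("EU AI Act"))
print(coverage("DigComp 3.0"))
```

A report generator built on a structure like this can answer coverage questions ("which activities evidence Article 50?") with a single lookup, which is the kind of evidence trail inspections tend to ask for.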
The governor report for EU schools references DigComp 3.0 and the EU AI Act alongside the international frameworks. For schools in member states with national curricula that reference DigComp, this provides evidence of alignment for inspections.
The frameworks do the mapping. Your teaching does the work. The two meet in the classroom, where a teacher asks a question about AI and a student starts thinking.
AILitKit maps activities to DigComp 3.0 and the EU AI Act alongside UNESCO, OECD AILit, PISA 2029, and Anthropic 4D. See all your framework coverage in one place. Try a free guide at ailitkit.com