AI in Education · 15 min read · December 9, 2025

The School Leader's Guide to AI in 2026: What You Actually Need to Know

AI adoption in schools has exploded: 92% of university students now use it, and more than half of K-12 students do. Here's the comprehensive guide to navigating the opportunities, risks, and decisions that will define your school's approach to AI.

Three years ago, when ChatGPT launched, school leaders faced a choice: ban it, ignore it, or figure it out. Most chose some combination of the first two.

That window has closed.

RAND's September 2025 study surveyed over 16,000 students, parents, teachers, and administrators across the United States. The findings are unambiguous: 54% of students and 53% of teachers now use AI for school—increases of more than 15 percentage points in just two years. At the university level, the shift is even more dramatic: student AI usage jumped from 66% in 2024 to 92% in 2025.

The tools are in your building. The question is whether you're leading the conversation or being led by it.

The Current Reality: Faster Than Policy Can Move

Here's what the data tells us about where we actually stand.

Students are using AI whether you have a policy or not. According to RAND, over 80% of students report that teachers never explicitly taught them how to use AI for schoolwork. Only 35% of district leaders provide students with any AI training. Meanwhile, 88% of students now use generative AI specifically for assessments—up from 53% just one year earlier.

Teachers are adopting AI faster than training can keep up. Cengage Group's 2025 research found that 63% of K-12 teachers have incorporated generative AI into their teaching, a 12-point year-over-year increase. Yet 68% of urban teachers report receiving no AI training whatsoever. Teachers are figuring it out alone, making decisions about academic integrity, assessment validity, and instructional design without guidance.

Policy is lagging badly. Only 45% of principals report having school or district policies on AI use. According to UNESCO, just 10% of the 450+ schools and universities it surveyed have established formal AI guidelines. Twenty-eight states have now published AI guidance for K-12 education, but most offer recommendations rather than requirements. Ohio and Tennessee are among the few states that actually mandate that schools adopt AI policies.

⚠️ The guidance gap

Your students are using AI. Your teachers are using AI. But more than half of schools have no formal policy governing how they should use it. This gap isn't just an oversight; it's a liability.

The Five Challenges Every School Must Address

After working with schools navigating this transition—and leading one through it myself—I've identified five challenges that every school leader must confront. These aren't theoretical concerns. They're showing up in your hallways right now.

1. The Detection Arms Race Is Over

For nearly three years, schools have been locked in an arms race with AI detection tools. It's time to acknowledge that this approach has failed.

Turnitin and other detection tools generate false positives that can devastate innocent students. RAND's research found that students worry about "false accusations of cheating"—and they're right to worry. Detection tools are particularly unreliable for non-native English speakers and students with certain learning differences.

More fundamentally, detection treats AI as a problem to catch rather than a tool to integrate. The most effective schools have shifted from "how do we detect AI use?" to "how do we design assessments where AI use is either irrelevant or transparent?"

Deep dive: Why the AI Detection Arms Race Is Already Over

2. The Teacher Training Crisis

The numbers are stark: 68% of urban teachers have received no AI training. Yet teachers who use AI tools at least weekly save an average of 5.9 hours per week, roughly six extra weeks of reclaimed time across a school year. Teachers who aren't trained aren't just missing efficiency gains; they're making uninformed decisions about academic integrity, data privacy, and instructional design every day.
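That "six weeks" figure is easy to check. A back-of-the-envelope calculation, assuming a roughly 38-week school year and a 40-hour working week (my assumptions, not figures from the survey):

$$5.9\ \tfrac{\text{hours}}{\text{week}} \times 38\ \text{weeks} \approx 224\ \text{hours}, \qquad \frac{224\ \text{hours}}{40\ \tfrac{\text{hours}}{\text{week}}} \approx 5.6\ \text{working weeks}$$

Shorter school years or longer working weeks shrink the result somewhat, but "roughly six weeks" holds up.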

Traditional professional development doesn't work for AI. One-day workshops on specific tools become obsolete within months. What teachers need is ongoing, practice-embedded learning that helps them develop judgment about AI rather than just familiarity with current applications.

Deep dive: 68% of Teachers Have Zero AI Training. What That Means for Your School.

3. The Safety Crisis: Deepfakes in Your Schools

This is the challenge that keeps me up at night.

The National Center for Missing and Exploited Children reports that AI-generated child sexual abuse images submitted to their CyberTipline soared from 4,700 in 2023 to 440,000 in just the first six months of 2025. That's not a typo. A nearly 100-fold increase in 18 months.
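The arithmetic is worth spelling out:

$$\frac{440{,}000}{4{,}700} \approx 94$$

And since the 2023 figure covers twelve months while the 2025 figure covers only six, the annualized increase is steeper still: if the second half of 2025 merely matches the first, the full-year ratio approaches 190-fold.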

RAND's research on deepfakes in K-12 schools found that 13% of principals reported incidents of bullying involving AI-generated deepfakes during the 2023-2024 and 2024-2025 school years. For middle and high schools, that number rises to one in five principals.

"Nudify" apps—tools that digitally remove clothing from photos—have made it trivially easy for students to create explicit images of classmates. At least half the states enacted legislation addressing deepfakes in 2025, but most school policies haven't caught up. Students don't realize the severity of what they're doing. Teachers don't know how to respond. And victims are left traumatized.

This isn't hypothetical. It's happening in schools like yours right now.

Deep dive: The Deepfake Crisis Coming to Your School

4. The Policy Vacuum

If your school doesn't have a comprehensive AI policy, you're operating on borrowed time. But what does "comprehensive" actually mean?

The best state guidance documents—from Georgia, North Carolina, and Ohio—address multiple dimensions:

  • Academic integrity frameworks addressing plagiarism and citation requirements
  • Data privacy and security protocols aligned with FERPA and COPPA
  • Implementation guidance for policy development, vendor vetting, and procurement
  • AI literacy programs with curated resources for educators and students
  • Risk management addressing bias, misinformation, and AI limitations

Generic policies that simply ban or permit AI aren't enough. You need guidance that helps teachers make contextual decisions: What's allowed on this assignment? What disclosure is required? How do we assess learning when AI can generate the output?

Deep dive: Creating an AI Policy That Actually Works

5. The Parent Communication Challenge

Parents are confused. Some want AI banned entirely—they've read the headlines about cheating and see AI as a threat to their children's education. Others are frustrated that their kids aren't learning AI skills that seem essential for the future workforce.

Both groups need clear, substantive communication from you. Not marketing language. Not defensive posturing. Honest engagement with the trade-offs.

Deep dive: How to Talk to Parents About AI

The Opportunities Worth Pursuing

The risks are real. But so are the opportunities. Ignoring AI doesn't protect your students—it leaves them unprepared for a world that's already here.

The Tutoring Revolution

A June 2025 randomized controlled trial from Harvard University, published in Scientific Reports, found that students using a custom-designed AI tutor learned more than twice as much in less time compared to students in active learning classrooms led by experienced human instructors. They also felt more engaged and more motivated.

This is landmark research. The study compared AI tutoring not to lectures but to active learning—the research-backed, best-practice pedagogy that has been demonstrated to substantially outperform traditional instruction. And AI tutoring still won.

The key caveat: design matters enormously. The Harvard team designed their AI tutor using research-based pedagogical principles. It didn't just give answers; it prompted students to think, provided scaffolded hints, and encouraged productive struggle. Unguided use of ChatGPT, by contrast, has been shown to harm learning outcomes when students use it to bypass thinking rather than support it.

The implication for schools: AI tutoring has enormous potential, but only when deliberately designed and appropriately scaffolded.

Deep dive: What the Research Actually Says About AI Tutoring

The Time Savings That Change Teaching

Teachers are drowning. AI can help.

HMH's educator survey found that 68% of teachers using AI report saving up to five hours per week. The most impactful applications aren't pedagogical—they're administrative. Drafting parent communications. Creating differentiated materials. Summarizing meeting notes. Generating quiz questions.

When teachers spend less time on administrative tasks, they have more time for what actually matters: building relationships with students, providing individualized feedback, and designing meaningful learning experiences.

The Accessibility Revolution

For students with disabilities, AI is already transformative. Text-to-speech tools help students with dyslexia access content they couldn't read independently. AI-powered translation breaks language barriers for English learners. Adaptive platforms adjust difficulty in real time based on student performance.

The U.S. Department of Education now explicitly encourages schools to consider AI tools for students with disabilities. For some students, these tools aren't just helpful—they're essential.

The Risks You Cannot Ignore

Opportunity and risk are inseparable. Here's what concerns me most.

The Cognitive Offloading Problem

A January 2025 study published in Societies found a significant negative correlation between frequent AI use and critical thinking abilities. The mechanism is cognitive offloading: when AI does the thinking, students stop doing it themselves.

The study found that younger participants exhibited higher dependence on AI tools and lower critical thinking scores compared to older participants. Harvard faculty have raised similar concerns: "If AI is doing your thinking for you, whether it's through auto-complete or whether it's in some more sophisticated ways... that is undercutting your critical thinking and your creativity."

This is not a reason to ban AI. It's a reason to be intentional about how it's used. The same AI that can shortcut learning can also scaffold it—but only if teachers know the difference.

Deep dive: Is AI Making Students Worse at Thinking?

The Data Privacy Minefield

Schools now use an average of 1,449 different EdTech tools, each potentially accessing sensitive student information. AI tools add new complications: student data may be used to train models, conversations with AI chatbots may be stored indefinitely, and the outputs of AI systems may reveal information about students in unexpected ways.

FERPA, COPPA, and state privacy laws provide a baseline—but many state AI guidance documents note that existing regulations weren't written with generative AI in mind. Schools need robust vendor vetting processes, clear data governance policies, and ongoing monitoring of how student data is being used.

Deep dive: AI and Student Data: What School Leaders Need to Know

The Equity Divide

AI has the potential to democratize access to high-quality tutoring and personalized learning. It also has the potential to deepen existing inequalities.

NPR reported on the emerging "AI divide"—well-resourced schools are integrating AI strategically while under-resourced schools lack the infrastructure, training, and support to use it effectively. A teacher survey found that teachers in higher-income schools were more likely to teach themselves about AI and more likely to use it in ways that enhance learning.

Meanwhile, consider the Alpha School in Austin, Texas, where students pay $40,000 in annual tuition for an AI-first education with personalized AI tutors, eye-tracking technology, and gamified lessons. The contrast with underfunded public schools is stark.

If AI becomes another tool that advantages the already-advantaged, we will have failed our students.

A Framework for School Leaders

How do you actually lead through this? Here's the framework I've developed.

Start With Clarity About Your Values

Before you write a single policy, answer these questions:

  • What is school for? If it's about developing critical thinking, creativity, and human judgment, your AI approach needs to protect and develop those capacities.
  • What's the role of productive struggle? Learning often requires difficulty. AI can eliminate difficulty—sometimes that's appropriate, sometimes it undermines the entire point.
  • What do we owe our students? They're entering a world where AI is ubiquitous. Do we prepare them by banning it, or by teaching them to use it wisely?

Your policies should flow from your values. Without that foundation, you'll end up with rules that feel arbitrary to teachers and students.

Build Your Own Fluency First

You cannot lead what you don't understand.

Harvard Business Review's research analyzed over 34 million managerial job postings and identified five critical skills leaders need in the AI age. None of them are technical. They're all about judgment, orchestration, and organizational design.

But judgment requires experience. You need to use AI yourself—not just read about it—before you can make informed decisions. Spend a month using AI for your own work: drafting communications, summarizing documents, generating first drafts of policies. You'll quickly learn what it does well and where it falls short.

Create a Coherent Policy Stack

Your school needs policies at multiple levels:

School-wide AI policy: What's our overall stance? What are our values? How do we think about AI in education?

Academic integrity policy: What constitutes appropriate vs. inappropriate AI use? What disclosure is required? How do consequences scale?

Assessment guidelines by context: Different assessments have different purposes. A take-home essay and an in-class exam require different AI policies.

Data governance policy: How do we vet AI vendors? What data can be used? What's our process for ongoing monitoring?

Safety and wellbeing policy: How do we address AI-generated harm? What's our response protocol for deepfake incidents?

These policies need to work together and be clearly communicated to teachers, students, and parents.

Invest in Teacher Capacity

This is where most schools underinvest.

Teachers need more than a workshop. They need ongoing learning communities where they can share experiments, analyze results, and problem-solve together. They need time to use AI for their own work before they're expected to guide students. They need scenarios that help them develop judgment, not just rules to follow.

Identify your early adopters—the teachers already experimenting with AI—and create structures for them to share what they're learning. Peer-led learning scales better than expert-delivered workshops and builds internal capacity.

Communicate Proactively With Parents

Don't wait for parents to come to you with concerns. Get ahead of the conversation.

Explain your approach. Acknowledge the risks and how you're addressing them. Describe the opportunities and why they matter. Give parents concrete guidance on how to support AI literacy at home.

Parents who understand your thinking are more likely to support it—even when they have questions.

What Happens If You Don't Engage

I want to be direct about the stakes.

You lose control of the narrative. Students and teachers will use AI whether you have guidance or not. Without informed leadership, that use will be haphazard, inconsistent, and probably problematic. You'll be reacting to crises instead of shaping culture.

You miss efficiency gains that could reduce burnout. Teachers using AI effectively report saving hours each week. For a profession experiencing a retention crisis, that's not trivial.

You can't evaluate what you don't understand. Vendors will pitch AI products. Teachers will propose AI-enhanced lessons. Board members will ask about your AI strategy. Without enough knowledge to ask good questions, you're making decisions blind.

You become the bottleneck. When you don't understand AI, you slow down the people who do. Your hesitation becomes the constraint on your school's ability to adapt.

Your students graduate unprepared. ChatGPT (60%) and prompt engineering (38%) are now the most-added skills on LinkedIn. Whether we like it or not, AI fluency is becoming a workforce expectation. Schools that ban AI entirely don't protect students—they disadvantage them.

Where to Start

If you're reading this and feeling overwhelmed, here's a realistic path forward.

This week: Use an AI tool yourself—Claude, ChatGPT, whatever's available—for one real task. Draft an email. Summarize a document. Generate questions for a staff meeting. Notice what it does well and what it misses.

This month: Have three conversations. Talk to a teacher who's experimenting with AI, one who's skeptical, and one parent who's concerned. Don't advocate—just listen. Their perspectives will sharpen your thinking.

This quarter: Audit your current state. What policies exist? What's actually happening in classrooms? What training has been provided? Where are the biggest gaps between practice and guidance?

This semester: Create or update your AI policy framework. Not a perfect policy—a working one that you can iterate on. Get it in front of teachers and start building shared understanding.

Ongoing: Make AI a standing topic in leadership discussions. This isn't a problem to solve once and forget. It's an ongoing leadership challenge that will evolve as the technology evolves.


The schools that thrive in the next decade won't be the ones that banned AI or the ones that embraced it uncritically. They'll be the ones that engaged thoughtfully—that protected what matters about human learning while preparing students for a world that's already here.

That's the work ahead of us.


If you're a school leader navigating AI implementation—or trying to figure out where to start—let's talk. This is exactly the kind of challenge I help leaders work through.


References

  1. AI Use in Schools Is Quickly Increasing but Guidance Lags Behind - RAND Corporation
  2. AI in Education Report: GenAI Adoption in K12 & Higher Education - Cengage Group
  3. 75 AI in Education Statistics 2026 - DemandSage
  4. How States Are Responding to the Rise of AI in Education - Education Commission of the States
  5. AI tutoring outperforms in-class active learning: an RCT - Scientific Reports / Nature
  6. The Deepfake Dilemma: New Challenges Protecting Students - National Center for Missing & Exploited Children
  7. Artificially Intelligent Bullies: Dealing with Deepfakes in K–12 Schools - RAND Corporation
  8. AI Tools in Society: Impacts on Cognitive Offloading and Critical Thinking - Societies / MDPI
  9. Is AI dulling our minds? - Harvard Gazette
  10. State AI Guidance for Education - AI for Education
  11. U.S. Department of Education Issues Guidance on AI Use in Schools - U.S. Department of Education
  12. An AI divide is growing in schools - NPR
  13. 5 Critical Skills Leaders Need in the Age of AI - Harvard Business Review

Frequently Asked Questions

Should we ban AI in our school?

Banning AI is not a viable long-term strategy. 92% of university students and more than half of K-12 students are already using it, and those numbers will only increase. Bans push AI use underground, prevent students from learning to use it responsibly, and leave them unprepared for a world where AI is ubiquitous. The question isn't whether to allow AI; it's how to guide its use intentionally.

What should our AI policy actually say?

Effective policies address multiple dimensions: academic integrity (what's allowed, what disclosure is required), data privacy (how we vet vendors, what data can be used), safety (how we respond to AI-generated harm), and instructional guidance (how teachers make contextual decisions). Generic policies that simply permit or ban AI aren't sufficient—you need guidance that helps people make good decisions in specific situations.

How do we train teachers who are already overwhelmed?

Start with time savings, not pedagogy. Teachers who see AI save them 30 minutes on parent emails are more receptive to exploring classroom applications. Use early adopters as peer trainers. Focus on frameworks and judgment, not specific tools that will become outdated. Build AI into existing PD structures rather than treating it as a separate initiative.

How do I talk to parents who want AI banned entirely?

Listen first. Their concerns—about cheating, about cognitive development, about their children's futures—are legitimate. Share what you're doing to address those concerns. Explain why you believe thoughtful integration serves students better than prohibition. Give them concrete guidance on how to support AI literacy at home. Parents who understand your reasoning are more likely to support your approach.

What about students who don't have access to AI tools at home?

The equity divide is real and concerning. Schools should ensure students have access to AI tools in school, even if they don't have them at home. Consider providing access to premium AI tools (Claude, GPT-4) rather than just free tiers. Build AI literacy into the curriculum so all students develop these skills, regardless of home resources. Monitor whether AI is widening or narrowing achievement gaps in your school.

Benedict Rinne, M.Ed.

Founder of KAIAK. Helping international school leaders simplify operations with AI. Connect on LinkedIn
