The Problem With "Just Try It"
Most AI tool introductions in schools follow a predictable pattern: a vendor demonstration at a professional development day, followed by an email with login instructions, followed by low adoption, followed by a conclusion that teachers are resistant to change. This pattern misses what is actually happening.
Teacher resistance to AI is, in most cases, rational and grounded in legitimate concerns. Treating it as irrationality to be overcome through enthusiasm and repetition is both disrespectful to teachers and ineffective as a strategy. The school leaders who achieve genuine AI adoption start by taking the resistance seriously.
Why Teachers Resist AI: The Actual Reasons
Fear of job replacement: Teachers have watched automation displace workers in manufacturing, transportation, and customer service. The question "will AI replace teachers?" is not an irrational anxiety; it is a reasonable inference from observed patterns. Leaders who dismiss this concern as paranoia will not address it. Leaders who name it directly and make specific, credible commitments about the role of AI in their institution are better positioned to build trust.
Concerns about student misuse: Teachers who have spent years teaching students to think for themselves are correctly skeptical of tools that make thinking-for-yourself optional. A tool that writes student essays is not a teacher's ally; it is an adversary to their core pedagogical goals. This concern rests on a real distinction between AI tools that support student learning (acceptable) and AI tools that bypass it (problematic). It deserves engagement, not dismissal.
Distrust of technology vendors: Most experienced teachers have seen three to five waves of "transformative" educational technology. Interactive whiteboards replaced chalkboards and then gathered dust. Learning management systems created compliance documentation burdens without improving learning. The skeptic who says "this too shall pass" has history on their side. Demonstrating that this technology is different requires evidence, not rhetoric.
Past experience with failed edtech initiatives: Beyond vendor skepticism, many teachers carry specific institutional wounds: a system that was implemented poorly, a tool that was mandated from above without training, a platform that was canceled mid-year after teachers had invested in learning it. These experiences are legitimate data that inform current caution.
What the Research Says
A 2024 RAND Corporation survey of US teachers found that 63% expressed concerns about AI tools in education. But the same survey found that among teachers who had actually used AI tools for at least one month, 71% reported time savings and 58% said they were likely to continue using them. The gap between pre-adoption concern and post-adoption experience is large and consistent across studies.
This finding matters for adoption strategy. The barrier for most reluctant teachers is not that AI tools are actually bad; it is that they have not had a low-risk opportunity to discover for themselves that the tools work. The change management task is creating that low-risk opportunity.
Change Management Principles for AI Adoption
Start with the pain point, not the tool. The worst introduction to an AI tool sounds like: "Here is this AI system you should try." The most effective introduction sounds like: "I know grading 120 essays takes you 8 hours. There is a tool that cuts that to 3 hours. Can I show you?" The tool is the solution. Lead with the problem it solves.
This requires that school leaders know, specifically, which administrative burdens are most onerous for their teachers. This knowledge comes from asking, not assuming. A brief survey asking "What takes you the most time that is not direct instruction?" provides the targeting information needed to introduce AI tools at the point of maximum relevance.
Design for permission, not mandate. The most consistent failure mode in institutional technology adoption is mandate-first rollout. Mandatory use of a new tool is experienced as an addition to workload, not a reduction of it, until the tool is learned. Learning happens fastest when it is voluntary and self-directed. Start with open access and genuine encouragement rather than requirements and deadlines.
Pilot with volunteers and make them visible. Early adopters, the teachers who are curious and willing to experiment, are the most valuable adoption resource available to school leaders. Identify them, support them with extra training and access, and create structured opportunities for them to share their experiences with colleagues. Peer-to-peer recommendation is more credible to skeptical teachers than administrator enthusiasm.
The Three-Stage Adoption Curve
Understanding the stages of adoption helps leaders calibrate their expectations and interventions:
Stage 1: Skeptic. The teacher doubts the tool works, doubts its relevance to their specific teaching context, and expects it to add work rather than reduce it. At this stage, the leader's job is to create a low-stakes opportunity to be proven wrong, not to argue, not to mandate, but to invite. A specific, relevant use case ("try the lesson planner for just one lesson this week") is more effective than a general encouragement to explore.
Stage 2: Experimenter. The teacher has tried the tool once or twice and seen that it produces something useful. They are not yet committed to regular use; they may still believe the initial experience was lucky, or that the tool will break down for more complex tasks. At this stage, the leader's job is to celebrate small wins and provide specific suggestions for expanded use that build on what already worked.
Stage 3: Advocate. The teacher is a regular user who has integrated specific AI tools into their workflow and can speak concretely about the time savings and quality improvements they experience. At this stage, the leader's job is to give them a platform: structured opportunities to share their experience with colleagues in Stages 1 and 2.
Specific Strategies That Work
Teacher-led discovery sessions: Rather than an administrator demonstrating AI tools, organize sessions where an early-adopter teacher demonstrates the specific workflow they use for a specific task. Peer demonstration is more credible and more relevant than vendor demonstrations.
Time-savings data: Ask teachers who are using AI tools to track their time for two weeks: time spent on lesson planning, grading, and communication before and after AI assistance. Concrete, personal data is more persuasive than vendor claims about efficiency.
Peer champions: Identify one AI-enthusiastic teacher per department or grade band. Give them additional support, training, and recognition. Their primary job is to be available to colleagues with questions, not to evangelize, but to help. Informal peer mentorship is more effective than formal training for tool adoption.
Explicit permission: Some teachers hesitate because they are unsure whether they are using AI "correctly," or because they fear institutional judgment. Explicit written permission from leadership, stating that AI tool use for lesson planning, feedback, and communication is not only allowed but encouraged, removes an adoption barrier that is rarely acknowledged but frequently operative.
Red Lines to Respect
Genuine adoption requires genuine respect for legitimate limits. There are AI uses that teachers are right to resist:
AI should not be used to make final grade decisions without teacher review. Rubric-based AI scoring can be a first pass; it cannot be the final record.
AI should not be used to respond to student mental health disclosures, disciplinary situations, or any high-stakes individual communication where the teacher-student relationship is at stake.
AI should not be mandated for assessment of student work in ways that bypass teacher judgment about individual students' circumstances, growth, and context.
Leaders who articulate these red lines clearly, and hold to them, build the credibility needed to advocate for AI use in the areas where it genuinely helps. The goal is genuine adoption that improves teacher wellbeing and student outcomes, not compliance with a technology mandate. The two require very different approaches.