For School Principals

Protecting Student Judgement While Using AI in Your School

You face a specific problem: AI tools like ChatGPT and Khanmigo arrive in your school before you have clarity on what cognitive skills you are actually trying to develop. Teachers get training on how to use these tools but not on when they should be off-limits. Your assessment data starts looking impressive while student judgement quietly deteriorates. The gap between policy and pedagogy is where real damage happens.

These are suggestions. Your situation will differ. Use what is useful.


Map Your School's Core Cognitive Skills Before Adopting Any Tool

Before Turnitin AI flags student work or Microsoft Copilot starts drafting essays, decide what thinking your school refuses to automate. This is not a compliance exercise. List the specific judgements you want students to make independently: deciding which sources matter, recognising when they do not understand something, choosing how to structure an argument. Once you have this map, every AI tool decision becomes measurable against actual learning goals, not against efficiency.

Build Teacher Professional Development That Addresses Identity, Not Just Buttons

Teachers receiving training on Khanmigo or Copilot often feel sidelined rather than supported. They see the tool as doing what they do, which erodes professional identity and autonomy. Your development sessions must address the real anxiety: What is a teacher for if the software explains the concept? Frame AI as expanding teacher work, not replacing it. Teachers become assessment designers, judgement coaches, and intervention specialists who know which students need the tool and which students need struggle.

Design Assessment Practices That Reveal Real Learning, Not AI-Assisted Output

Your assessment data can become dangerously misleading when students submit AI-generated work or use AI throughout the learning process. You will see improved grades and completion rates while actual student judgement stalls. Instead of relying on final products, build assessment that captures thinking in progress. Use low-stakes quizzes that happen before AI touches the work. Observe student behaviour during problem-solving. Ask students to explain their thinking out loud where the tool cannot help them.

Protect Wellbeing by Setting Clear Boundaries on Tool Use

Students using ChatGPT or Copilot for every difficulty develop a dependency that looks like confidence but is actually learned helplessness. They stop trying difficult things because the tool is faster. Your policy needs to name this problem directly and set boundaries that protect student willingness to struggle. Frame these boundaries as wellbeing decisions, not restrictions. Students who never experience productive struggle lose resilience and the ability to trust their own thinking. Your school's role is to preserve that capacity.

Create a Decision-Making Process for New AI Tools That Tests Pedagogy, Not Just Capability

Every time a vendor offers a new AI product, your staff will push you to pilot it. Without a clear decision process, adoption spreads based on enthusiasm rather than learning outcomes. Build a simple three-question evaluation that any teacher can use before recommending a tool. The questions focus on pedagogy, not features. First: What cognitive skill does this tool replace, and is that a skill we want students to keep? Second: What new learning does this tool enable that was impossible before? Third: How will we know if students are developing the skill we care about, or just getting better at using the tool?

Key principles

  1. Decide what skills your school will not automate before you adopt any tool.
  2. Teacher professional development must address identity and autonomy, not just feature training.
  3. Assessment that captures thinking in progress reveals real learning far better than final products when AI is involved.
  4. Protect student wellbeing by setting boundaries that preserve productive struggle and intellectual resilience.
  5. Evaluate new AI tools against pedagogy first and capability second.
