For Teachers and Educators
Teachers are using AI to save time on lesson planning and assessment, but often in ways that erode the very skills students need to develop. The mistakes happen quietly: a ChatGPT-generated explanation feels clear, so you use it; Khanmigo answers a student's question directly, so the struggle stops; your own creative problem-solving atrophies because Magic School AI generates differentiated activities in seconds.
These are observations, not criticism. Recognising the pattern is the first step.
When a student submits writing from ChatGPT or uses Diffit to rewrite their essay, you may judge the final product as competent without knowing whether the student can think through the problem independently. The work looks polished, so comprehension feels confirmed, when it has not actually been tested.
The fix
Follow submitted work with a verbal check: ask the student to explain one key idea in their own words, or to apply the concept to a new example on the spot.
Khanmigo can identify a student's misconception and explain it clearly, which feels efficient. You then skip the step of listening to how your student arrived at their wrong answer, missing the specific thinking pattern you need to address.
The fix
When a student gets something wrong, ask them how they worked it out before offering any explanation or tool suggestion.
Magic School AI and similar tools create assessments quickly, but they sometimes ask questions that are too easy, too vague, or misaligned with what you actually taught. You may use the quiz without realising it does not measure what you intended.
The fix
Before students sit a quiz, work through every question yourself and compare it to your lesson notes to confirm it tests the right concepts at the right level.
When Google Gemini summarises a chapter for your class or you paste text into ChatGPT to condense it, you have a clear, usable summary. You might then assume students understand the material because they have read the summary, even though they never had to process the original text or hold contradictions in mind.
The fix
Use AI summaries only as a reference for yourself when designing tasks, never as a substitute for students engaging with primary material.
A student uses ChatGPT to check their maths work or Diffit to refine an argument, and the final answer is right. You mark it as correct without asking whether the student found the answer or the AI did.
The fix
For any high-stakes assessment, require students to show their working and explain their reasoning, and mark these elements separately from the final answer.
Magic School AI can produce differentiated activities in seconds, which is seductive when you have three year groups to teach. You may stop observing which students actually struggle with what, because the tool promises to handle it. Your sense of your class's thinking deteriorates.
The fix
Use AI-generated activities only as a starting point, and always pilot them with your class first to see where your students actually get stuck.
ChatGPT writes in a neutral, accessible voice, which is why teachers copy its explanations into lesson slides. But a textbook explanation is often designed to teach, while ChatGPT is designed to be understood instantly. The textbook may contain productive struggle or careful scaffolding that ChatGPT flattens.
The fix
Read both the textbook explanation and ChatGPT's, then write your own explanation or teach it aloud first, preserving the textbook's structure and the moments where it deliberately slows the learner down.
When ChatGPT creates a marking scheme for an essay or test, it covers the main points but may miss the common errors you know your students make, or credit answers you would not accept. You then mark unfairly because you are following a computer's logic instead of your own professional judgement.
The fix
Ask ChatGPT for the marking scheme, then add a section for common misconceptions and edge cases based on what you have seen your students do before.
When you paste a lesson plan into Google Gemini or ask Magic School AI to generate explanations, you get polished content. You then read it aloud or use it as slides, skipping the moment where you think on your feet, notice confusion, and find a new analogy that works for this room on this day. Your teaching becomes delivery instead of responsive work.
The fix
Always teach a lesson aloud to yourself first before using any prepared content, so you remember what you are trying to make happen and stay present to your students.
Diffit and Magic School AI can produce variations of tasks for different ability levels, which saves time. But if you do not track which students get which version or why, you lose sight of progression and may leave some students in the same tier too long.
The fix
Keep a simple record of which differentiated tasks you gave to which students and what you observed, so you can adjust next time instead of repeating the same version.
When a student is stuck, Khanmigo can prompt them or give hints, which feels like support. But you may reach for it too quickly, before the student has spent time being confused. Confusion is where thinking happens. By removing it, you remove the condition for learning.
The fix
When a student is stuck, ask them to talk through what they have tried and what they are unsure about, wait for at least thirty seconds of silence, then consider Khanmigo only if they need a very specific hint.
A student uses ChatGPT to finish their homework or Diffit to improve their writing, and completes the task. You see completion and assume competence, even though the student may never have attempted the hard part themselves.
The fix
Set a clear classroom rule: you check the first draft before any AI tool is used, so you know what the student produced independently.
ChatGPT and Google Gemini sound authoritative but can contain factual errors. If you do not teach students to verify what AI produces, they will trust it because it reads well, and they will lose the habit of questioning sources.
The fix
When you use AI in front of the class, pause and ask students to fact-check one claim with you, then ask them to identify what would make them doubt the AI and where they would check.
You ask students to summarise a text, upload it to ChatGPT to check if it is AI-generated, and call it assessment. But a summary task invites plagiarism. You are fighting the design with a tool instead of changing the task.
The fix
Ask students to answer a question that requires them to combine the text with something from class, or to disagree with the author, so their own thinking is visible in what they submit.
Teachers often say students can use ChatGPT or Google Gemini to brainstorm or check their thinking. But if students have not learned to recognise confusion, generate their own ideas first, or challenge an AI response, these tools become shortcuts instead of thinking partners.
The fix
Before students use any AI for a task, model the behaviour aloud: show them how you brainstorm alone first, then ask AI a specific question, then disagree with part of its answer.