Lesson 191 of 1550
When AI Gets It Wrong: Teaching Kids to Catch Hallucinations
AI models confidently state false things. Teaching kids to catch this builds a critical lifelong habit — but the lesson is more about general skepticism than AI specifically.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The premise
2. Hallucination
3. Fact-checking
4. Confidence calibration
Section 1
The premise
AI confidence does not equal AI accuracy; kids need to learn to verify what AI tells them, especially in domains where it sounds most authoritative.
What parents can do here
- Show kids specific examples where AI confidently states something false (history, science, current events)
- Build the verification habit — name a primary source for any claim that matters
- Talk about why AI sounds confident even when wrong (training, no built-in 'I don't know')
- Make 'check the source' the family mantra
What this approach can't do
- Make kids skeptical of every AI output (that's exhausting and unhelpful)
- Substitute for actual fact-checking (which is a skill)
- Replace school instruction in critical reading
Related lessons
Keep going
Adults & Professionals · 40 min
Screen Time vs. AI Time: Why the Categories Are Already Outdated
Screen-time guidelines from 2018 don't account for kids using AI as a homework partner or creative collaborator. Parents need a new framework — one that distinguishes consumption from interaction, passive from generative.
Adults & Professionals · 9 min
Homework Help With AI: House Rules That Build Skills Instead of Replacing Them
AI can do your kid's homework — but it can also explain a concept three different ways until it clicks. The difference is in the house rules. Here's a framework parents can adopt this week.
Adults & Professionals · 12 min
AI Companion Apps: What Parents Need to Know About Replika, Character.AI, and the Rest
AI companion apps have exploded in popularity with teens. Some are benign, some have genuinely harmed kids. Parents need to know how the apps work, what the risks are, and how to talk about them at home.
