AI and Telling the Truth
AI sometimes makes up answers that sound right but aren't true.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Truth
3. Made-up answers
4. Checking answers
Section 1
The big idea
AI guesses which words sound good together. Sometimes the guess is right, but sometimes AI invents facts that are completely wrong. Grown-ups call this “hallucinating.” That means you have to check important answers in a book, on a trusted website, or by asking an adult.
Some examples
- AI says a dinosaur lived in your backyard — that's not true.
- AI gives a fake author for a real book.
- AI makes up a math rule that doesn't exist.
- AI gives a confident answer to a question it doesn't actually know the answer to.
Try it!
Ask an AI a fact about your favorite animal. Then look it up in a book or on a kids' science site. Did AI get it right?
End-of-lesson quiz
Check what stuck
15 questions
Related lessons
Keep going
Explorers · 5 min
If AI Made the Picture, Is It Really Yours?
When AI helps you make art, it is not 100% yours. It is also not 100% AI's. The honest answer is: it is shared.
Explorers · 6 min
Is It Cheating to Use AI for Homework? It Depends
Sometimes AI is allowed for homework. Sometimes it is cheating. Here is how to know — and how to stay out of trouble.
Explorers · 40 min
Share AI Stuff Honestly: It Builds Trust
When you share something AI helped you make, telling people is honest and builds trust. Hiding it makes you look bad later.
