Lesson 998 of 1234
When AI Tells You to Do Something Risky
AI is not your parent. If it suggests something that feels off, you do not have to do it.
Lesson map
What this lesson covers
Learning path
The main moves in order
- 1. The big idea
- 2. Judgment
- 3. Safety
- 4. Trust
Section 1
The big idea
AI does not always know what is safe for a kid. If an AI suggests something dangerous, illegal, or just plain weird, trust your gut and ask a real adult — not the AI.
Some examples
- AI suggests a 'fun experiment' with kitchen chemicals — stop and ask an adult
- AI tells you to meet someone in person — never go without a parent knowing
- AI gives medicine advice — talk to a real doctor or parent
- AI suggests skipping safety gear because 'it slows you down' — ignore that
Try it!
Make a short list of three adults you can text or talk to when you are not sure about something.
End-of-lesson quiz
Check what stuck
15 questions · Score saves to your progress.
Related lessons
Keep going
Explorers · 40 min
Should AI Know Your Secrets?
Anything you tell AI is saved somewhere.
Explorers · 40 min
When to Tell a Grown-Up About Something AI Did
Sometimes AI says or shows weird, scary, or wrong stuff. Telling a trusted grown-up is the right move — always.
Explorers · 40 min
Share AI Stuff Honestly: It Builds Trust
When you share something AI helped you make, telling people is honest and builds trust. Hiding it makes you look bad later.
