Do Not Confide in AI Chatbots
AI chatbots feel like a friend.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. Do Not Confide in AI Chatbots
2. Training data
3. Data leakage
4. Privacy policy
Section 1
Do Not Confide in AI Chatbots
AI chatbots feel like a friend. They are actually a service: what you tell them may be stored, used to train future models, or read by human reviewers.
Famous case: in 2023, Samsung employees pasted confidential source code into ChatGPT to debug it. Fearing the code could end up in OpenAI's training data, the company banned the tool internally.
Three things never to share
- Passwords, PINs, or codes
- Information that identifies you
- Secrets that aren't yours to share
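The habit above can even be automated. Here is a minimal sketch of a pre-send check that scans a draft message for the kinds of information the list warns about; the patterns and function names are illustrative assumptions, not an exhaustive or production-ready filter.

```python
import re

# Hypothetical pre-flight check: scan a draft chatbot message for categories
# of information this lesson says never to share. Patterns are illustrative,
# not exhaustive -- a real filter would need far broader coverage.
SENSITIVE_PATTERNS = {
    "password or PIN": re.compile(r"(?i)\b(password|passwd|pin|passcode)\b\s*[:=]?\s*\S+"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone number": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "credit card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def flag_sensitive(message: str) -> list[str]:
    """Return the names of sensitive categories found in the message."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(message)]

warnings = flag_sensitive("My password: hunter2, email me at sam@example.com")
# warnings -> ["password or PIN", "email address"]
```

A check like this catches only obvious patterns; it cannot recognize secrets that aren't yours to share, which is why the human habit still matters.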
The big idea: AI chatbots feel private. They are not. Treat them like a stranger at a bus stop.