What AI Apps Actually Do With Your Data: Read the Fine Print
Every AI app has a privacy policy that says what happens to your stuff. Most teens never read them. Here is what to look for.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Privacy policies
3. Data use
4. Training data
Section 1
The big idea
Privacy policies are boring on purpose — companies hope you skip them. But they tell you exactly what happens to your data. The 5 minutes it takes to skim is worth it.
Some examples
- Look for: 'We may use your data to train our AI' — that means your conversations help build future AI.
- Look for: 'We share data with third parties' — your stuff goes to other companies too.
- Look for: 'Data retention period' — how long they keep what you share.
- Look for: 'Your rights' — can you delete your data, see what they have, opt out of training?
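The phrases above can even be checked for mechanically. Here is a minimal sketch that scans a policy's text for watch-phrases; the `POLICY` text and the exact phrase list are invented for illustration, not taken from any real app:

```python
# Sketch: flag privacy-policy lines worth a closer read.
# The watch-phrases and POLICY text below are made-up examples.
RED_FLAGS = [
    "train",           # your data may train future AI models
    "third parties",   # your stuff goes to other companies
    "retention",       # how long they keep what you share
    "delete",          # your rights: deletion, access, opt-out
]

POLICY = """We may use your data to train our AI.
We share data with third parties for analytics.
Data retention period: 24 months. You may delete your account."""

def flag_lines(policy: str, phrases: list[str]) -> list[str]:
    """Return policy lines containing any watch-phrase (case-insensitive)."""
    hits = []
    for line in policy.splitlines():
        if any(p in line.lower() for p in phrases):
            hits.append(line)
    return hits

for line in flag_lines(POLICY, RED_FLAGS):
    print("FLAG:", line)
```

A scanner like this only narrows down where to look; you still have to read the flagged sentences yourself, since "we do not share data with third parties" would trip the same filter.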
Try it!
Pick one AI app you use regularly. Find the privacy policy. Skim it for 5 minutes. Note: do they train on your data? Can you delete it? Decide if you are okay with the answers.
Related lessons
Keep going
Builders · 7 min
AI Conversations Are Not Truly Private
Stuff you tell AI may be logged, used for training, or even seen by humans. Treat AI conversations as public, not private.
Builders · 40 min
AI 'companion' apps: what they want from you
AI girlfriend / boyfriend / friend apps are designed to be addictive. Here's what they're actually doing.
Builders · 7 min
AI image generators trained on stolen art
Many AI art tools were trained on artwork without permission. Knowing this helps you choose ethically.
