Lesson 359 of 1234
If a Deepfake Happens to You, Tell Someone Right Away
If someone makes a fake video or image of you, tell a grown-up immediately. Do not delete evidence. Help is available.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Deepfake response
3. Reporting
4. Evidence
Section 1
The big idea
If you are ever the subject of a deepfake or fake AI image — created without your permission — there are real steps to take. The faster you act, the more help is available.
The steps
- Step 1: Take screenshots of the deepfake (evidence you will need).
- Step 2: Tell a parent, school counselor, or teacher you trust.
- Step 3: Report to the platform where it was shared.
- Step 4: Depending on what happened, police may be involved. Sexual deepfakes in particular are now illegal in most US states.
Try it!
Talk to a parent: 'If anyone ever made a deepfake of me, what would we do?' Now you both know the plan.
End-of-lesson quiz
Check what stuck
15 questions · Score saves to your progress.
Related lessons
Keep going
Builders · 40 min
Laws Against Deepfakes
As of 2026, most US states have laws against malicious deepfakes, especially deepfake porn and political deepfakes.
Explorers · 7 min
AI and saying no to scary AI content
If AI shows you something scary, you can stop and tell a grown-up.
Builders · 7 min
AI and Someone Generating Mean Essays About You
Classmates can use AI to mass-produce harassment content — here's how to fight back.
