Deepfakes: When a Fake Looks Like Someone You Know
A deepfake is a fake video or voice that looks and sounds like a real person. Here is what they are, why they hurt people, and what to do if you see one.
Lesson map
What this lesson covers

Learning path: the main moves in order
1. What Is a Deepfake?

Concept cluster
Terms to connect while reading: deepfake, voice cloning, reporting
Section 1
What Is a Deepfake?
A deepfake is a fake video or audio clip where an AI puts one person's face or voice on top of something they never did or said. The word comes from deep learning and fake.
Some deepfakes are funny, like a silly video of a celebrity dancing to a new song. Some are mean and used to embarrass or trick people. Some are scary and used to scam or bully someone.
Why deepfakes hurt, even if they are fake
A mean deepfake of you can feel just as awful as a real mean picture of you. Maybe worse, because you never actually did the thing. Once it is out there, it is hard to delete, and it can stick with you for years.
- People might believe the fake is real
- The target might be too scared to go to school
- The target's family sees it too
- Even if everyone knows it is fake, the feeling is still bad
What to do if you see a deepfake of someone
1. Do not share it. Not even to make fun of it. Every share spreads the harm.
2. Do not comment on it. Comments push it higher in the feed.
3. Screenshot it as evidence, then report it to the app or site.
4. Tell a grown-up you trust: a parent, a teacher, or a school counselor.
5. If it is a friend or classmate, reach out to them privately. They probably feel terrible.
The law is catching up
In 2024 and 2025, countries around the world passed new laws against deepfake harassment. In the US, the Take It Down Act was signed in 2025. It requires websites to remove harmful deepfakes within 48 hours of a victim's request. Other places have similar rules now.
“It is easier to make a deepfake than to clean one up. That is why we care so much about stopping them before they start.”
The big idea: deepfakes can hurt real people. Do not make them, do not share them, and if you see one, report it. Being kind online is one of the most powerful things you can do.
Related lessons
Keep going
Explorers · 15 min
The Golden Rule, But With AI
You can do things with AI you could never do before. That means you can also hurt people in new ways. Here is the simple rule that keeps you on the right side of the line.
Explorers · 20 min
Real or Fake? Spotting AI Pictures and Videos
AI can now make pictures and videos that look absolutely real. Here are the signs to look for and the habits that will keep you smart.
Explorers · 15 min
AI Can Be Totally, Confidently Wrong
AI sounds sure of itself even when it is making stuff up. Here is how to notice when it is wrong and what to do about it.
