Lesson 666 of 1234
AI Can Copy Voices — Even Your Mom's
AI can clone how someone sounds, which is useful AND a little scary.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Voice cloning
3. Scams
4. Verification
Section 1
The big idea
With just a few seconds of audio, AI can copy how a person talks. Bad guys sometimes use this to trick people into thinking a family member is calling.
Some examples
- A scammer might fake your grandma's voice to ask for money.
- Someone could fake a teacher's voice to trick the class.
- AI voices in commercials might sound like real famous people.
One defense: if a 'family member' calls saying they're in trouble, ask a question only the real person would know.
Try it!
Pick a secret 'safe word' with your family. If anyone ever calls in an emergency, they should say the word!
End-of-lesson quiz
Check what stuck
15 questions.
Related lessons
Keep going
Explorers · 40 min
AI Can Make Fake Things Look Real
AI can make fake pictures, fake videos, and fake voices that look and sound real.
Explorers · 5 min
AI Can Mix History Up: Fact vs Fiction
AI sometimes blends real history with fiction. For school, only use verified history sources, not just AI.
Explorers · 5 min
Just Because AI Said It Doesn't Make It True
AI sounds smart, but you still need to think for yourself.
