When AI Voice-Clones Pretend to Be Your Friend
Three seconds of audio is enough to clone someone's voice now. Scammers use it on teens too.
Lesson map
The main moves in order
- 1. When AI Voice-Clones Pretend to Be Your Friend
- 2. voice cloning
- 3. vishing
- 4. family code word
Section 1
When AI Voice-Clones Pretend to Be Your Friend
What to actually do
- Set up a family code word that any caller claiming to be a relative must say
- Listen for odd pauses or mistimed breathing; cloned voices often get these wrong
- Treat any urgent "emergency" call asking for money as a giant red flag; hang up and call the person back on a number you already have
Key terms in this lesson
The big idea: Voices can be faked now. Code words and call-backs are how real people prove they're real.
Related lessons
Keep going
Builders · 18 min
When Someone Clones a Voice
AI now needs only 3 seconds of audio to clone a voice.
Builders · 18 min
The Grandkid in Trouble Scam
Scammers clone a kid's voice from social media and call grandparents pretending the child is in trouble, needing bail or hospital money fast. The voice on the phone sounded exactly like her grandson, because it was his voice, AI-cloned from TikTok.
Builders · 9 min
Music Remixes With AI: What's Legal and What's Not
Suno and Udio can generate full songs in seconds. The technology is amazing — and the legal stuff is messy. Here's what you need to know to remix safely.
