Protecting Grandparents From AI Voice-Cloning Scams
AI voice cloning helped drive the $2.7B in imposter-scam losses the FTC reported for 2023. Grandparents are the #1 target. You're often the first line of defense.
Lesson map
What this lesson covers
Learning path
The main moves in order
- 1. The big idea
- 2. How to Warn Your Parents About AI Voice Scams (Without Sounding Crazy)
- 3. The big idea
Concept cluster
Terms to connect while reading
Section 1
The big idea
Voice-cloning AI now needs only about 3 seconds of someone's voice — easily scraped from a TikTok, Instagram Reel, or voicemail greeting — to produce a real-time clone that can call grandparents pretending to be 'me, in jail, please send bail money.' The FTC logged $2.7B in imposter-scam losses in 2023, and voice cloning makes these scams far more convincing.
Some examples
- Set up a family code word — a random word grandparents can ask if a 'family member' calls in distress. AI doesn't know it.
- Lock down public-facing audio: TikToks, voicemail greetings, and Instagram Reels with your voice are training data for scammers.
- If a grandparent gets a 'family emergency' call, the rule is: hang up, call the family member back at their known number, then decide.
- Walk grandparents through it before it happens — once a scam succeeds, $5,000–$15,000 is typically gone within 90 minutes and rarely recoverable.
Try it!
Pick a random, slightly weird family code word — something an outsider couldn't guess but everyone in the family will remember (a pet's name spelled backwards, an inside joke). Text it to all grandparents and parents this week. Explain why.
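If you want to make the exercise concrete (or just have fun with it), the "pet's name spelled backwards" trick can be sketched in a few lines of Python. The helper name and word list here are made up for illustration — any familiar-but-mangled word works:

```python
import secrets

# Illustrative sketch: pick a family code word by mangling a word
# the whole family knows (e.g. a pet's name), so it's memorable to
# insiders but not guessable by a scammer.
def make_code_word(familiar_words):
    word = secrets.choice(familiar_words)  # cryptographically random pick
    return word[::-1].lower()              # reverse it: "Biscuit" -> "tiucsib"

print(make_code_word(["Biscuit", "Noodle", "Pickle"]))
```

The point isn't the code — it's that the word should be easy for family to reconstruct and hard for an outsider (or an AI trained on your public audio) to guess.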
Section 2
How to Warn Your Parents About AI Voice Scams (Without Sounding Crazy)
Section 3
The big idea
AI voice cloning makes the 'grandparent scam' (a sobbing 'grandchild' asking for bail money) terrifyingly believable — the FTC logged $2.7B in imposter scams in 2023, and grandparents are top targets. The fix is the same family safe word from the voice-clone lesson, but the harder part is convincing your parents and grandparents to take it seriously without sounding like a doomer. The framing that works: 'This isn't paranoia, it's the same as a fire drill — five minutes today saves an awful day later.'
Some examples
- FBI 2024 PSA: AI voice scams against seniors are now in the top 5 elder-fraud categories nationwide.
- AARP's 2024 fraud watch: median grandparent-scam loss is $9,000 — and it climbs into six figures when AI voice cloning and caller-ID spoofing are combined.
- The fix: a 'family code word' EVERY family member knows, especially grandparents — anyone calling for emergency money has to say it.
- Show grandparents an example AI voice clone (ElevenLabs has a public demo) — the 'oh wow' moment is what makes them take it seriously.
Try it!
At the next family dinner or video call, demo an AI voice clone (or play a YouTube example), then propose a family safe word. Make it weird so it's memorable. Bonus: write it on a sticky note on grandma's phone.
Key terms in this lesson
End-of-lesson quiz
Check what stuck
15 questions
Related lessons
Keep going
Builders · 22 min
Explaining AI to Parents Who Think It's Just ChatGPT
Most parents have a five-year-out-of-date picture of AI. Updating them helps them parent better and trust you more.
Builders · 22 min
Helping Grandparents Avoid AI Voice-Cloning Scams
Older relatives are the #1 target for AI voice scams in 2026. Your role might be more important than you think.
Builders · 40 min
When AI Is the Wrong Helper for the Real Stuff
There are some conversations AI can't replace — even though it's tempting to ask the bot first.
