AI Fake Celebrity Ads: Why MrBeast and Taylor Swift Scams Keep Working
AI voice clones of MrBeast giving away iPhones aren't pranks; they're FTC-actionable fraud, and resharing can make you liable.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Voice clone
3. FTC
4. Celebrity rights
Concept cluster
Terms to connect while reading
Section 1
The big idea
In 2024, a deepfake MrBeast iPhone giveaway cost TikTok users millions. Teens who reshared it 'as a joke' were named in class-action discovery.
Some examples
- MrBeast publicly disowns these — his real giveaways never ask for shipping fees
- The Taylor Swift Le Creuset scam ran on Meta for six weeks
- ElevenLabs voice clones are watermarked, but TikTok strips the watermarks
- Resharing can create secondary liability under the FTC Act
Try it!
Find one suspicious 'celebrity giveaway' on your For You page today. Report it with TikTok's report tool under the 'scam or fraud' category. Don't reshare it, even ironically.
Key terms in this lesson
End-of-lesson quiz
Check what stuck
15 questions
Related lessons
Keep going
Adults & Professionals · 11 min
AI and synthetic voice consent: scoping and revocation
Build voice-clone consent records that are scope-limited, time-bound, and revocable — and design the revocation flow before launch.
Adults & Professionals · 9 min
AI and Paid Promotion Disclosure: FTC-Safe Ad Labels
AI helps creators draft FTC-compliant paid promotion disclosure that survives a regulator's read.
Builders · 18 min
When Someone Clones a Voice
AI now needs as little as three seconds of audio to clone a voice.
