What to Do the First Hour of an AI Sextortion Scam
Scammers use AI to fake nudes from your public photos and demand crypto. The first 60 minutes decide how it ends.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. AI and Sextortion Scams: How to Spot and Survive Them
3. The big idea
4. AI and Sextortion Scam Scripts: Spot the Pattern Before You Send
Section 1
The big idea
AI sextortion is when a stranger DMs you AI-generated nudes of yourself — built from your public Instagram photos — and threatens to send them to your contacts unless you pay. The FBI's guidance: do NOT pay (paying never stops the threats), do NOT delete (you need the evidence), and DO report to NCMEC's CyberTipline within hours. Paying escalates the demands; reporting gets accounts frozen.
Some examples
- An Instagram DM with a nudified photo of you arrives at 11pm — you screenshot the message, the username, and the bitcoin wallet, then block but don't delete the chat.
- You go to report.cybertip.org and file a CyberTipline report (free, anonymous if you want, run by NCMEC) — this triggers actual federal investigation.
- You also use takeitdown.ncmec.org to push a hash of the fake image to Meta, TikTok, Snap, X, and Reddit so it can't be re-uploaded.
- You tell ONE trusted adult — an awkward conversation tonight beats the alternative; the FBI's own PSA warns that teen suicides have followed silence.
Try it!
Save report.cybertip.org and takeitdown.ncmec.org as bookmarks on your phone right now. If it ever happens to you or a friend, you'll have the playbook open before panic sets in.
Section 2
AI and Sextortion Scams: How to Spot and Survive Them
Section 3
The big idea
AI sextortion has exploded — scammers grab your photo from Instagram, generate a fake nude, and threaten to send it to your contacts unless you pay. The FBI has linked multiple teen suicides to these scams. Paying makes it worse. Reporting fast is the move.
Some examples
- The FBI ran a 2024 alert on AI-fueled teen sextortion.
- Scammers usually demand $500–$2,000 in gift cards or crypto.
- Block, screenshot, then report at CyberTipline.org and tell a parent.
- NCMEC's Take It Down service can help remove fake images of minors.
Try it!
Save these in your phone notes right now: CyberTipline.org and TakeItDown.NCMEC.org. You may never need them — but a friend might.
Section 4
AI and Sextortion Scam Scripts: Spot the Pattern Before You Send
Section 5
The big idea
Sextortion crews use AI to generate flawless flirty messages in any language, then pivot to demanding money once a teen sends a photo. The opening lines follow a pattern — once you see it, you cannot unsee it.
Some examples
- Ask Claude to list the 8 most common sextortion opening lines so you recognize them in your DMs.
- Ask ChatGPT what to do in the first 10 minutes if a friend already sent something.
- Ask Gemini how Take It Down and the FBI tip line work for under-18 victims.
- Ask Perplexity for the latest sextortion case stats to share with friends who think it cannot happen to them.
Try it!
Open your DMs from this week. Run any flirty stranger message past Claude and ask 'is this a sextortion opener?' Get a second opinion in 30 seconds.
End-of-lesson quiz
Check what stuck
15 questions · Score saves to your progress.
Related lessons
Keep going
Builders · 40 min
Laws Against Deepfakes
As of 2026, most US states have laws against malicious deepfakes — especially deepfake porn and political deepfakes.
Builders · 40 min
Why Misinformation Spreads So Fast
AI-generated misinformation goes viral because outrage and surprise drive shares — and AI is great at making both.
Builders · 40 min
Why an AI Chatbot Isn't a Therapist
AI mental-health bots can listen, but they don't know you, can't call for help, and sometimes give risky advice.
