Lesson 827 of 1570
AI Therapy Chatbots — Real Help or Risk?
Apps like Woebot and Wysa use AI for mental health — here's the honest take.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Mental health AI
3. CBT bots
4. Limits
Section 1
The big idea
AI therapy chatbots can teach real techniques like CBT and journaling. They're available at 3am when no human is. But they CAN'T handle a crisis, and pretending they can has hurt people.
Some examples
- Good for: late-night anxiety check-ins and journaling prompts.
- Good for: practicing reframes and tracking your mood.
- NOT for: suicidal thoughts, abuse, or active crisis.
- Always: a real human therapist beats a bot for anything serious.
Try it!
Look up the crisis hotline for your country and save it in your phone right now, before you ever need it.
Related lessons
Keep going
Builders · 40 min
When (and When Not) to Use an AI Symptom Checker
AI symptom checkers are useful for some things, dangerous for others. Here is a teen-friendly guide to when they help and when they hurt.
Builders · 7 min
AI and anxiety spirals: chatbots at 2am
How AI chat helps anxiety — and when it makes the spiral worse.
Builders · 40 min
AI Mental Health Apps: Helpful for Some Things, Not Replacement Therapy
Apps like Woebot use AI to help with everyday stress and feelings. Useful for some stuff. Not a replacement for a real therapist or trusted adult.
