AI Therapy Chatbots — Real Help or Risk?
Apps like Woebot and Wysa use AI for mental health — here's the honest take.
Adults & Professionals · AI in Healthcare · ~4 min read
The big idea
AI therapy chatbots can teach real techniques like CBT and journaling. They're available at 3am when no human is. But they CAN'T handle a crisis, and pretending they can has hurt people.
Some examples
- Good for: late-night anxiety check-ins and journaling prompts.
- Good for: practicing reframes and tracking your mood.
- NOT for: suicidal thoughts, abuse, or active crisis.
- Always: a real human therapist beats a bot when things get serious.
Try it!
Look up the crisis hotline for your country and save it in your phone right now, before you ever need it.
Practice this safely
Use a real but low-risk workflow from your day. Treat AI as a drafting and organizing layer, then verify the output before anyone relies on it.
1. Ask AI to explain mental health AI in plain language, then underline anything that sounds uncertain or too broad.
2. Give it one detail from "AI Therapy Chatbots — Real Help or Risk?" and ask for two possible next steps, plus one reason each step might be wrong.
3. Check what a CBT bot tells you against a trusted source (a teacher, adult, expert, or original document) before you act on it.
Related lessons
Adults & Professionals · 40 min
When (and When Not) to Use an AI Symptom Checker
AI symptom checkers are useful for some things, dangerous for others. Here is a practical guide to when they help and when they hurt.
Adults & Professionals · 7 min
AI and anxiety spirals: chatbots at 2am
How AI chat helps anxiety — and when it makes the spiral worse.
Adults & Professionals · 10 min
Clinical Documentation With LLMs: Drafting Notes Without Losing Clinical Judgment
Large language models can transform sparse clinical observations into structured draft notes — saving physicians and nurses time while keeping the clinician's judgment as the authoritative final voice.
