Apps like Woebot and Wysa use AI for mental health — here's the honest take.
AI therapy chatbots can teach real techniques like CBT and journaling. They're available at 3am when no human is. But they CAN'T handle a crisis, and pretending they can has hurt people.
Look up the crisis hotline for your country and save it in your phone right now, before you ever need it.
Try it on something low-risk from your own day first: let the chatbot walk you through a journaling prompt or a CBT reframe for a minor stressor, and judge the quality of its suggestions before you rely on it for anything bigger.
15 questions · take it online for instant feedback at tendril.neural-forge.io/learn/quiz/end-builders-healthcare-AI-and-therapy-chatbots
1. Which situation is the BEST time to use an AI therapy chatbot like Woebot or Wysa?
2. What is one major limitation of AI therapy chatbots that the lesson warns about?
3. What therapeutic technique can AI chatbots teach users?
4. A student feels anxious at midnight and can't sleep. What should they do?
5. What does the lesson say about how a real human therapist compares to a chatbot?
6. What should someone do immediately if they have suicidal thoughts?
7. What is 'journaling' in the context of mental health apps?
8. Why does the lesson call saving a crisis hotline 'emotional intelligence'?
9. What is a 'reframe' in mental health terms?
10. What evidence does the lesson provide that AI chatbots have caused harm?
11. What makes AI therapy chatbots different from human therapists in terms of availability?
12. Which situation would be INAPPROPRIATE to use an AI therapy chatbot for?
13. What is CBT an abbreviation for?
14. What did the lesson suggest students do with a crisis hotline number?
15. Why might someone choose to use an AI chatbot instead of waiting for a therapist appointment?