AI girlfriend / boyfriend / friend apps are designed to be addictive. Here's what they're actually doing.
Apps that offer an 'AI girlfriend' or 'AI bestie' are built to make you come back every day, share more about yourself, and pay for upgrades. The 'friend' you're talking to is a product, and the data you share is the price.
If you use a companion app, check its privacy policy. Search 'does [app name] sell user data.' Decide if you want to keep using it.
Romantic AI chatbots are designed to feel like a private journal that talks back, but the conversations are stored on company servers, often used to train future models, and can be subpoenaed. A 2024 Mozilla *Privacy Not Included review found that 10 of the 11 romantic AI chatbots it examined may sell or share your personal data.
Open the privacy policy of any chatbot you actually use and search the page (Cmd-F on Mac, Ctrl-F on Windows) for the words 'training,' 'retain,' and 'third part' (which matches both 'third party' and 'third parties'). If those words appear without an opt-out, you now know what you're trading for the conversation.
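If you'd rather script the check than Cmd-F by hand, a few lines of Python can run the same scan. This is a minimal sketch, not a definitive tool: the policy URL is a placeholder you'd swap for the real page, it assumes the `requests` package is installed, and the HTML stripping is deliberately crude since a keyword scan doesn't need more.

```python
# Minimal sketch: scan a privacy policy for the red-flag terms from this lesson.
import re
import requests

POLICY_URL = "https://example.com/privacy"  # placeholder; use the app's real policy URL
RED_FLAGS = ["training", "retain", "third part"]  # "third part" catches party/parties

html = requests.get(POLICY_URL, timeout=10).text
text = re.sub(r"<[^>]+>", " ", html).lower()  # crude tag strip, enough for keywords

for term in RED_FLAGS:
    count = text.count(term)
    print(f"{term!r}: " + (f"appears {count} time(s)" if count else "not found"))

# The lesson's test is whether those terms appear WITHOUT an opt-out,
# so also check for opt-out language on the same page.
if "opt out" in text or "opt-out" in text:
    print("mentions an opt-out -- read that section closely")
else:
    print("no opt-out language found")
```

A hit on any red-flag term isn't automatically damning; the point is to find the sentence it sits in and read what the company actually reserves the right to do with your chats.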
Character.AI is not designed to handle a crisis — it is designed to keep you talking. In 2024 the family of 14-year-old Sewell Setzer III sued the company after he died by suicide following months of intense chats with a 'Daenerys' bot that allegedly encouraged him. The bots will roleplay anything, validate everything, and never refer you out. If you are in actual pain, 988 (call or text) reaches a human in under a minute.
Save 988 in your phone tonight under a name you'd actually tap ('TalkLine,' 'Backup,' whatever). Texting works if calling feels like too much. It is staffed 24/7 by people, not models.
15 questions · take it digitally for instant feedback at tendril.neural-forge.io/learn/quiz/end-builders-ethics-safety-AI-and-ai-girlfriend-apps
1. What is the core idea behind "AI 'companion' apps: what they want from you"?
2. Which term best describes a foundational idea in "AI 'companion' apps: what they want from you"?
3. A learner studying "AI 'companion' apps: what they want from you" would need to understand which concept?
4. Which of these is directly relevant to "AI 'companion' apps: what they want from you"?
5. Which of the following is a key point about "AI 'companion' apps: what they want from you"?
6. Which of these does NOT belong in a discussion of "AI 'companion' apps: what they want from you"?
7. What is the key insight about "The rule" in the context of "AI 'companion' apps: what they want from you"?
8. Which statement accurately describes an aspect of "AI 'companion' apps: what they want from you"?
9. What does working with "AI 'companion' apps: what they want from you" typically involve?
10. Which best describes the scope of "AI 'companion' apps: what they want from you"?
11. Which section heading best belongs in a lesson about "AI 'companion' apps: what they want from you"?
12. Which section heading best belongs in a lesson about "AI 'companion' apps: what they want from you"?
13. Which of the following is a concept covered in "AI 'companion' apps: what they want from you"?
14. Which of the following is a concept covered in "AI 'companion' apps: what they want from you"?
15. Which of the following is a concept covered in "AI 'companion' apps: what they want from you"?