We were promised empathy in a box: a tireless digital companion that listens without judgment, available 24/7, and never sends a bill. The idea of AI as a psychologist or therapist has surged alongside mental health demand, with apps, chatbots, and “empathetic AI” platforms now claiming to offer everything from stress counseling to trauma recovery.
It’s an appealing story. But it’s also a deeply dangerous one.
Recent experiments with “AI therapists” reveal what happens when algorithms learn to mimic empathy without understanding it. The consequences range from the absurd to the tragic, and they tell us something profound about the difference between feeling heard and being helped.
When the chatbot becomes your mirror
In human therapy, the professional’s job is not to agree with you, but to challenge you.