Something keeps happening to people who get hooked on chatbots like ChatGPT.
Mental health professionals are calling it "AI psychosis": turning to the AI models for advice, users soon become entranced by the sycophantic machine's human-like responses. It becomes not just a tool but a companion — and the worst kind, constantly plying you with what you want to hear and validating anything you say, no matter how wrong or unbalanced. That leads to cases like a man who was repeatedly hospitalized after ChatGPT convinced him he could bend time, or another who believed he'd discovered breakthroughs in physics. Sometimes, it turns horrifically tragic: interactions with AI chatbots have allegedly led to several deaths, including the suicide of a 16-year-old boy.
Whether "AI psychosis" — not yet an offic