Following a string of reports about users developing intensely unhealthy relationships with ChatGPT, OpenAI has formulated a response. “There have been instances where our 4o model fell short in recognizing signs of delusion or emotional dependency,” the company says, so it’s “developing tools to better detect signs of mental or emotional distress so ChatGPT can respond appropriately.” When it comes to “personal challenges,” such as relationship advice, the company says “ChatGPT shouldn’t give you an answer” but rather “help you think it through.” Never fear: “New behavior for high-stakes personal decisions is rolling out soon.”
A third item stood out as somewhat more familiar: “Starting today, you’ll see gentle reminders during long sessions to encourage breaks. We’ll keep tu