The new ChatGPT-5 is wiser than the last one. It wouldn't have encouraged a man to eat sodium bromide, which literally drove him crazy.

The guy's goal was to lower his sodium intake, so he asked GPT-3.5, the free version at the time, whether sodium bromide would be a good substitute for table salt. It said yes. But sodium bromide causes hallucinations, grotesque rashes and more. Eventually the bromide in his body reached thousands of times the tolerable level. Fully psychotic, he spent three weeks in a mental hospital. Doctors had him drink water to flush it out. Still delusional, he tried to escape. Finally, he was cured.

When I asked ChatGPT-5 how its answer would differ from the previous GPT's, it said it would have asked clarifying questions, told the guy not to sprinkle sodium bromide on his food and explained why it's so dangerous.
