A man who used ChatGPT for dietary advice ended up poisoning himself — and wound up in the hospital.

The 60-year-old man, who was looking to eliminate table salt from his diet for health reasons, used the large language model (LLM) to get suggestions for what to replace it with, according to a case study published this week in the Annals of Internal Medicine.

When ChatGPT suggested swapping sodium chloride (table salt) for sodium bromide, the man made the replacement for a three-month period, although the journal article noted the recommendation was likely referring to bromide for other purposes, such as cleaning.
