A 60-year-old man asked ChatGPT for advice on how to replace table salt, and the substitution landed him in the emergency room suffering from hallucinations and other symptoms.

In a case report published this month in the Annals of Internal Medicine, three doctors from the University of Washington in Seattle used the man's case to illustrate how AI tools, as they are currently designed, are not always reliable when it comes to medical advice.

“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the authors, Audrey Eichenberger, Stephen Thielke and Adam Van Buskirk, wrote.
