Artificial Intelligence

Man Poisoned After Following ChatGPT’s Deadly Health Advice

When it comes to your health, you might want to stick to trusted medical sources instead of turning to artificial intelligence for answers. A new case report from Washington State describes how dietary advice attributed to ChatGPT left a man severely poisoned, triggering a dangerous mix of mental and physical symptoms.

According to a report published in the Annals of Internal Medicine: Clinical Cases, a 60-year-old man with no prior medical or psychiatric history arrived at the emergency department convinced that his neighbor was poisoning him. His physical examination was largely normal, except for extreme thirst — yet he refused offered water out of paranoia.

Within 24 hours, his condition deteriorated rapidly. The patient developed intense paranoia and hallucinations, and even attempted to escape the hospital, resulting in an involuntary psychiatric hold.

Doctors soon discovered the problem wasn’t just dehydration. Blood tests revealed severe deficiencies in several micronutrients, including vitamin C, vitamin B12, and folate. The patient also reported new skin problems such as acne and cherry angiomas, along with poor sleep, fatigue, and ataxia, a loss of coordination affecting balance, speech, and swallowing.

The underlying cause was surprising: the man had recently decided to cut table salt from his diet after reading about its health risks. But instead of simply reducing his sodium chloride intake, he set out to eliminate chloride entirely, even though chloride is an essential electrolyte. Following advice he said came from ChatGPT, he replaced table salt with sodium bromide purchased online.

The switch proved dangerous. While bromide can substitute for chloride in industrial applications such as pool sanitation and fire retardants, ingesting it over time is toxic to humans, producing a syndrome known as bromism. Bromide poisoning can cause nausea, skin eruptions, and severe psychiatric symptoms, and it was once a common cause of psychiatric hospital admissions before being banned from over-the-counter medications in 1975.

The patient’s serum bromide level was found to be 1,700 mg/L, roughly 233 times the upper limit of the normal reference range. Thankfully, after he stopped taking the bromide “salt” and received medical treatment, his symptoms improved within three weeks.

The case highlights the risks of relying on AI chatbots for medical advice. According to the report, ChatGPT’s response noted that “context matters,” but it offered no clear health warning and asked none of the follow-up questions a doctor would typically consider.

Doctors warn: Always consult a qualified medical professional for health decisions — and never replace essential minerals with toxic substances.