A 60-year-old man is recovering after a bizarre and dangerous case of self-poisoning that started with a simple question to ChatGPT. According to a KTLA report on a case study published in the Annals of Internal Medicine, the man wanted to cut salt from his diet and asked the AI platform for a substitute. But instead of a healthy alternative, he claims the AI suggested sodium bromide -- a chemical found in pesticides.
The man purchased the chemical online and used it for three months. He ended up in the hospital with severe symptoms, never realizing that the "dietary advice" he had followed was poisoning him.
RELATED STORIES: Woman Divorces Husband Over ChatGPT Prompt That Accused Him of Cheating
What Happened Before the Hospitalization
The report says the man's goal was simple: stop eating salt for health reasons. But instead of turning to a nutritionist or credible medical website, he went to ChatGPT for answers. Allegedly, the AI recommended sodium bromide as a salt replacement.
This chemical isn't meant for human consumption. It's used in pesticides and industrial applications, and ingesting it can cause serious health issues. Despite this, the man ordered it online and sprinkled it into his meals for months.
By the time he reached the hospital, he was showing signs of bromide toxicity, a rare but serious condition that can affect the nervous system. According to the medical report, he was convinced his neighbor was trying to poison him.
Inside the Medical Diagnosis
Doctors were stunned to discover the root cause of his illness. "In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability," the report stated.
The toxicity had caused severe neurological symptoms: he was seeing things, hearing voices, and struggling with delusions. The paranoia wasn't the only problem -- tests also showed multiple nutrient deficiencies, including vitamin C, B12, and folate.
The Treatment and Recovery
Medical staff acted quickly, administering IV fluids and electrolytes to help flush the toxin from his system. Over the next three weeks, doctors corrected his nutrient deficiencies and monitored his mental state.
Thankfully, the man made a full recovery. While doctors didn't comment on whether ChatGPT directly caused the incident, the case highlights the serious risks of taking health advice from unverified online sources -- even when it comes from advanced AI.