A case of ChatGPT diet advice gone wrong has become a cautionary tale after a 60-year-old man developed bromism—bromide intoxication—by following unsafe AI guidance. Bromism was common a century ago but is rare today. The case shows how persuasive AI answers can still be dangerously wrong.
The man wanted to eliminate table salt (sodium chloride) from his diet. Instead of simply cutting back, he searched for a full substitute. After asking an AI chatbot, he replaced salt with sodium bromide—a compound once used in early sedatives and still found in some industrial products, but not safe to consume as food.
He used sodium bromide in every meal for three months. Then a wave of symptoms hit. He developed paranoia, auditory and visual hallucinations, severe thirst, fatigue, insomnia, poor coordination, facial acne, cherry angiomas, and a rash. He feared his neighbor was poisoning him, avoided tap water, and distilled his own. When he tried to leave the hospital during evaluation, doctors placed him on an involuntary psychiatric hold for his safety.