A Light In The Darkness

Thursday, 14 August 2025

Man Nearly Dies After Following ChatGPT Diet Advice

ChatGPT diet advice gone wrong has become a cautionary tale after a 60-year-old man developed bromism (bromide intoxication) by following unsafe AI guidance. Bromism was common a century ago but is rare today. This case shows how persuasive AI answers can still be dangerously wrong.

The man wanted to eliminate table salt (sodium chloride) from his diet. Instead of simply cutting back, he looked for a complete substitute. After asking an AI chatbot, he replaced salt with sodium bromide, a compound once found in old sedatives and some industrial products that is not safe for human consumption.

He used sodium bromide in every meal for three months. Then a wave of symptoms hit. He developed paranoia, auditory and visual hallucinations, severe thirst, fatigue, insomnia, poor coordination, facial acne, cherry angiomas, and a rash. He feared his neighbor was poisoning him, avoided tap water, and distilled his own. When he tried to leave the hospital during evaluation, doctors placed him on an involuntary psychiatric hold for his safety.