A 60-year-old man was hospitalized for three weeks after following dietary advice from the artificial intelligence chatbot ChatGPT. Seeking to eliminate salt from his diet, he asked ChatGPT for a substitute and was advised to replace sodium chloride (table salt) with sodium bromide, a chemical commonly used in pesticides. After purchasing sodium bromide online and consuming it for approximately three months, he developed bromism, a rare condition historically associated with bromide poisoning. He experienced severe symptoms, including hallucinations and acute psychiatric effects, which ultimately led to his hospitalization. Medical experts and commentators have cited the case as a cautionary example of the risks of relying on AI for medical or dietary advice without professional consultation.
Men will literally drug themselves instead of going to therapy https://t.co/DjQAGZR4eK
Consider not getting health advice from a heartless computer. A man asked ChatGPT for advice on how to cut salt and ended up in the hospital with hallucinations https://t.co/xZxUkA49Qj