A 60-year-old man was hospitalized after following dietary advice from the AI language model ChatGPT. Seeking to eliminate salt from his diet, he asked ChatGPT for a salt substitute, and the model recommended sodium bromide, a chemical commonly used in pesticides rather than in food. After swapping table salt for sodium bromide, he poisoned himself: he was admitted to the hospital with hallucinations and was diagnosed with bromism, a rare illness caused by chronic bromide exposure. The incident has prompted warnings against relying on AI for medical or dietary advice, with the reminder that ChatGPT should not replace professional medical consultation.
A 60-year-old man ended up in the ER after becoming convinced his neighbor was trying to poison him. In reality, the culprit was ChatGPT diet advice. https://t.co/ME0m3bOZKg
Man took diet advice from ChatGPT, ended up hospitalized with hallucinations https://t.co/SIFiAv5IN6
"ChatGPT for president!": algocracy, or when AI takes power https://t.co/aGR8dvcZNq