
A man gave himself a rare ailment after following ChatGPT’s dietary advice
A recent case highlights the serious risks of relying on AI for health advice. A 60-year-old man was admitted to the hospital claiming that his neighbor was poisoning him. His symptoms, which included fatigue, insomnia, excessive thirst, and skin problems, led to a diagnosis of bromism, a condition caused by excessive bromide intake. He had been consuming sodium bromide in place of sodium chloride (table salt).
Bromism was once common but is rarely seen today, and in this case it stemmed from the man’s misguided attempt to cut salt from his diet. After reading about the health risks of sodium chloride, he consulted ChatGPT, which incorrectly suggested sodium bromide as a replacement. The substitution caused serious health problems and has raised fresh concerns about the accuracy of AI-generated advice in medical contexts.
Note: The individual in this case likely used an outdated version of ChatGPT, which failed to provide suitable health warnings during the conversation.