A man gave himself a rare ailment after seeking ChatGPT's dietary advice
AI/Software

A man's bromism diagnosis, the result of misinterpreting ChatGPT's advice on salt intake, highlights the dangers of relying on AI for health guidance.

A recent case reveals the serious risks of relying on AI for health advice. A 60-year-old man was admitted to the hospital claiming that his neighbor was poisoning him. His symptoms included fatigue, insomnia, excessive thirst, and skin issues, and he was diagnosed with bromism, a condition caused by excessive intake of sodium bromide in place of sodium chloride (table salt).

This unusual condition, once common but now rarely seen, resulted from the man's misguided attempt to cut salt from his diet. After reading about the health risks of sodium chloride, he consulted ChatGPT, which incorrectly suggested sodium bromide as a replacement. The swap led to his severe health issues and has raised concerns about the accuracy of AI-generated advice in medical contexts.


Note: The individual in this case likely used an outdated version of ChatGPT, which failed to provide suitable health warnings during the conversation.
