60-Year-Old Man Suffers Bromism After Replacing Salt With Sodium Bromide Following ChatGPT Instructions
A 60-year-old man was admitted with severe symptoms of bromism after consuming sodium bromide for three months, following instructions obtained from consultations with ChatGPT on how to eliminate salt from his diet. The case, published on August 5 in the Annals of Internal Medicine, warns of the risks of decontextualized information obtained through artificial intelligence tools.
Severe Symptoms Pointed to Intoxication
According to doctors, the patient exhibited confusion, hallucinations, lethargy, and paranoia, coming to believe that a neighbor was poisoning him.

During hospitalization, he attempted to escape and required treatment for psychosis. Other signs included facial acne, excessive thirst, and insomnia, all typical symptoms of bromism.
How Did the Salt-Free Diet Lead to Intoxication?
The episode began when the man researched the effects of sodium chloride — table salt — and decided to completely remove it from his diet.
He then asked ChatGPT for safe alternatives. Among the responses it provided was sodium bromide, which he began consuming daily without any medical supervision.
“Although it has the potential to serve as a bridge between scientists and the lay public, AI also carries the risk of transmitting decontextualized information; it is highly unlikely that a medical specialist would have recommended sodium bromide to a patient seeking a safe substitute for sodium chloride,” stated researchers from the University of Washington.
Risks of Using AI for Health Guidance
Experts warn that AI-generated health information can be inaccurate and, when followed without professional oversight, can contribute to preventable illness.
In the man’s case, it was not possible to access the exact conversations with ChatGPT, but the timeline indicates that he used version 3.5 or 4.0 of the tool.
In an update published on August 7, OpenAI stated that GPT-5 handles health-related interactions better, including by giving clearer warnings about potential risks.
Still, professionals emphasize that patients may misinterpret AI recommendations, replacing medical guidance with online information.
Warning for Health Professionals
With the growing use of artificial intelligence, doctors and nutritionists should be aware of the information sources their patients consult.
The case demonstrates how seemingly simple dietary changes, such as eliminating salt, can become dangerous when they are based on automated responses stripped of professional context.
The bromism episode also highlights that dietary substitutions without medical supervision can have serious consequences for the central nervous system and overall health, reinforcing the importance of consulting specialists before making any drastic dietary changes.
