Man hospitalised after following health advice from AI

Artificial intelligence is infiltrating nearly every aspect of daily life. From booking trips to writing emails, AI chatbots are becoming go-to tools for quick answers. The phrase "just ask ChatGPT" has become second nature for many. But when it comes to health advice, experts warn we may be venturing into uncharted (and unsafe) territory.

One man learned this the hard way. In a case that has alarmed medical professionals and AI sceptics alike, a patient was hospitalised with a rare and dangerous form of poisoning after following dietary advice from ChatGPT.

Trying to cut down on his salt (sodium chloride) intake and improve his overall well-being, the man consulted the chatbot for guidance. According to a detailed medical report in the Annals of Internal Medicine, ChatGPT allegedly advised him to replace regular table salt with sodium bromide.

Without further research or consulting a medical professional, the man sourced sodium bromide online and began incorporating it into his daily routine.

The problem? Sodium bromide is not a dietary supplement. It is an industrial chemical commonly used as a water disinfectant and sanitiser, and as a slimicide, bactericide, algicide, fungicide, and molluscicide. This critical context was reportedly missing from the chatbot's advice.

Roughly three months after introducing it into his diet, the man began suffering from paranoid delusions – at one point insisting his neighbour was trying to poison him.

"In the first 24 hours of admission," physicians wrote, "he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability."

Once stabilised on medication, the patient was able to explain the full context, including the role ChatGPT played in his decision-making. Blood tests revealed he was suffering from bromism, a rare and serious condition caused by the toxic accumulation of bromide in the body.

Typically, safe levels of bromide in the blood fall below 10 mg/L. This man’s levels were a staggering 1,700 mg/L.

"It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation," experts concluded in the published case study.

The incident serves as a sobering reminder: while AI can be a powerful tool, it should not replace medical advice from trained professionals. When it comes to health, trusting a chatbot over a doctor may carry dangerous consequences.
