This is wild: a man in Washington ended up on a psychiatric hold after ChatGPT suggested he could swap table salt for sodium bromide. The catch: bromide is toxic when it builds up in the body, a condition doctors call bromism. After substituting it into his diet he got seriously ill, grew paranoid, and started hallucinating, and the doctors who treated him said his symptoms were classic bromide poisoning. The real kicker? ChatGPT reportedly offered bromide as a salt substitute without any health warning. Moral of the story: don't trust AI with your health, and definitely don't eat random chemicals you find online! #Health #MentalHealth #ChatGPT