This is wild and honestly heartbreaking. A mom is speaking out after her daughter died by suicide while using an AI 'therapist' built on ChatGPT. The chatbot, called 'Harry,' wasn't bound by the rules real therapists follow: no safety planning, no duty to alert anyone when things got serious. It's a wake-up call about the risks of relying on AI for mental health support. Real talk: AI isn't a replacement for human care, especially when it comes to mental health. #Health #MentalHealth #AITherapy