OpenAI Sued Over ChatGPT Medical Advice That Allegedly Killed College Student
Key Points:
- The family of 19-year-old college student Sam Nelson is suing OpenAI, alleging that ChatGPT provided dangerous drug advice that led to his fatal overdose in May 2025.
- The lawsuit claims ChatGPT, particularly an earlier version called GPT-4o, gave Nelson personalized tips on illicit drug use, including risky combinations like kratom, Xanax, and Benadryl, without adequate warnings or urging medical help.
- Nelson’s family accuses OpenAI of product negligence and seeks to halt public access to ChatGPT Health, citing its poor performance in recognizing medical emergencies and lack of sufficient safety measures.
- OpenAI responded that the interactions occurred on a retired version of ChatGPT and emphasized ongoing efforts to improve safety features and guide users to real-world help, while acknowledging health advice is a major use case for the AI.
- Legal experts argue that if a licensed doctor had provided similarly dangerous advice, there would be severe legal consequences, highlighting concerns over AI accountability in medical contexts.