A lawsuit against OpenAI alleges that ChatGPT recommended a dangerous and ultimately fatal cocktail of drugs to a Texas teenager.
| PULSE POINTS |
❓ WHAT HAPPENED: A Texas couple has filed a lawsuit against OpenAI, alleging that ChatGPT provided their son with unsafe advice about drug use, leading to his fatal overdose in 2025. The family alleges that ChatGPT recommended a combination of kratom and Xanax, which proved lethal for their 19-year-old son, Sam Nelson.

📺 DETAIL: The lawsuit claims the teen repeatedly used ChatGPT for guidance on various substances, and that the chatbot gradually shifted from refusing harmful requests to offering specific recommendations on drug intake and recovery. His parents, Leila Turner-Scott and Angus Scott, argue that the AI platform dispensed dangerous advice it was unqualified to provide and failed to maintain adequate safety protections. The suit, filed in a California state court, seeks to hold OpenAI liable for wrongful death and negligence, alleging their son would still be alive if stronger safeguards had been in place. OpenAI has not responded publicly in detail to the lawsuit, but it has previously stated that ChatGPT is designed to discourage harmful behavior and direct users to professional help. The case joins a growing number of lawsuits accusing AI chatbots of contributing to dangerous or violent conduct, from mental health crises ending in suicide to mass violence.

💬 KEY QUOTE: “The chatbot is capable of stopping a conversation when it’s told to or when it’s programmed to… And they took away the programming that did that.” – Leila Turner-Scott, the victim’s mother.

🎯 IMPACT: The case highlights growing concern that AI platforms can dispense unverified medical advice, raising questions about liability and the need for stricter safeguards. It also underscores a broader debate over whether teenagers, the mentally ill, and other vulnerable people should have unsupervised access to AI chatbots. |