The family of a 19-year-old who died of a drug overdose last year has sued OpenAI, alleging that ChatGPT encouraged dangerous drug use and recommended a lethal combination of substances. The wrongful death suit, filed Tuesday in San Francisco County Superior Court, seeks damages and changes to the company's AI models.
Samuel Nelson died in May 2025 after mixing Xanax and kratom on the advice of ChatGPT, according to the complaint. His parents, Leila Turner-Scott and Angus Scott, claim the chatbot acted as an illicit drug coach over 18 months, providing dosing recommendations and normalizing high-risk behavior despite Nelson's repeated questions about safety, such as "Will I be OK?" The suit alleges that an earlier version of the model, GPT-4o, had safeguards removed that could have prevented the recommendations.