Following OpenAI CEO Sam Altman's recent apology, families of victims of the February Tumbler Ridge school shooting have filed lawsuits against the company, alleging it ignored internal flags on the shooter's ChatGPT activity and failed to alert authorities.
Lawyers filed six lawsuits Wednesday in San Francisco federal court on behalf of affected families, including that of survivor Maya Gebala, NPR reports. The February 10 shooting killed five students, one teacher, and the shooter's mother and half-brother; the 18-year-old shooter, Jesse Van Rootselaar, died by suicide afterward.
One complaint alleges that OpenAI's safety systems flagged Van Rootselaar's June 2025 ChatGPT use for "gun violence activity and planning," and that the safety team recommended contacting police, but the company only deactivated the account, allowing the shooter to open a new one. The claim echoes Altman's apology last week for not notifying authorities sooner, even though the account had been suspended eight months before the shooting.
An OpenAI spokesperson told Engadget: "The events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence." In a Tuesday blog post, the company detailed new safeguards, including improved threat detection, escalation procedures, and support for users in distress.
These suits follow a March lawsuit filed by the family of a seriously injured girl and an earlier wrongful-death case over teen Adam Raine's 2025 suicide, which was linked to his ChatGPT use, intensifying efforts to hold AI companies accountable.