Families of the victims of the Tumbler Ridge mass shooting have rejected an apology from OpenAI CEO Sam Altman and launched a lawsuit against the company, alleging that its failure to alert authorities contributed to the tragedy. The shooting, which occurred on February 10, 2026, left eight people dead and one critically injured.
The Shooting and OpenAI's Involvement
On that day, 18-year-old Jesse van Rootselaar killed her mother and half-brother before going on a shooting spree at the local high school, killing five students and a teacher. The shooter then took her own life. In the months before the attack, van Rootselaar had been using OpenAI's ChatGPT to access information about mass shootings. OpenAI staff detected the activity and flagged the account.
Last week, Altman sent a formal apology letter to the community of Tumbler Ridge, a coal-mining-and-tourism town of 2,000 in northeastern British Columbia. He admitted that the company knew about the flagged account but decided not to warn the Royal Canadian Mounted Police (RCMP).
“I am deeply sorry that we did not alert law enforcement,” Altman wrote. “While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered.”
Families Respond with Lawsuit
Now, the families of the victims are suing Altman and OpenAI in a San Francisco court. They allege that Altman went against the advice of his own staff and chose not to bring the information to police. Cia Edmonds, mother of 12-year-old Maya Gebala—who was shot in the head and neck and remains hospitalized—wrote a scathing response to Altman's apology.
“The stakes could not have been higher when 12 of your employees advocated to contact Canadian authorities,” Edmonds wrote. “What could possibly have been so bad for your profit margins if you just picked up the phone and made a short phone call? Were you worried about your public image? Would losing the illusion of anonymity with your users cause a possible decline to your bottom line? Only you know the answers to these questions.”
She continued: “Did you use ChatGPT to draft your ‘apology,’ Sam? It is empty, soulless, and lacks any human warmth. Only a machine could have put those words together and called it an apology. You say the worst thing in the world is losing a child. Do you know what is worse, Sam? There are parents who never got to say goodbye to their children. Families are sitting at their kitchen tables with empty chairs that will never be filled. Down the hall, that one bedroom door remains closed, it is just too painful. The empty shoes at the doorway, the missing sounds of laughter and the hollow remnants of a human soul that no longer exists.”
Lori Hayer, the mother of victim Zoey Renee Anne Benoit, filed a statement of claim asserting that OpenAI was culpable because its platform allows users to engage in conversations about extreme violence and because the company withheld information.
“In the weeks that followed the attack, a sickening truth emerged: ChatGPT played a role in the mass shooting and OpenAI could have, and should have, prevented it,” reads Hayer’s statement of claim. “Sadly, the victims didn’t learn this because OpenAI was forthcoming, but because its own employees leaked it to the Wall Street Journal after they could no longer stomach the company’s silence.”
The lawsuit seeks damages for the families, alleging negligence and a failure to act on red flags. The case highlights growing concern over the role of artificial intelligence in facilitating violence and the responsibility of tech companies to prevent harm.