OpenAI Sued by Families of Canada School Shooting Victims

April 29, 2026, 1:42 PM UTC

OpenAI is the target of new lawsuits over the mass shooting in Tumbler Ridge, British Columbia, which allege the artificial intelligence company could have stopped the suspected killer from using its popular chatbot, ChatGPT, ahead of the attack.

The cases were filed Wednesday in federal court in San Francisco against OpenAI and its chief executive officer, Sam Altman. One was brought by a 12-year-old girl who was shot during the attack and remains in intensive care, together with her mother. Another was brought by the mother of a girl killed in the shooting.

According to the lawsuits, OpenAI knew from the shooter’s ChatGPT use that Jesse Van Rootselaar, identified as the chief suspect behind the February massacre at Tumbler Ridge Secondary School, was planning the attack, but made a “conscious decision not to warn authorities.”

People pay their respects at a community vigil in Tumbler Ridge
Photographer: Paige Taylor White/AFP/Getty Images

“ChatGPT played a role in the mass shooting and OpenAI could have, and should have, prevented it,” according to the complaints, which allege the startup wanted to avoid having to contact police each time OpenAI’s safety team spotted a ChatGPT user planning to carry out a violent act.

“The events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence,” OpenAI said in a statement. “As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators.”

A series of suits has been filed against chatbot makers since 2024, most of them targeting OpenAI and ChatGPT. The suits generally allege that extensive use of the technology has inflicted a range of harms on children and adults alike, fostering delusions and despair in some and leading others to death by suicide and even murder-suicide.

On Feb. 10, Van Rootselaar allegedly carried out the mass shooting in northeastern British Columbia, killing eight people — including her mother and stepbrother, along with six others at the school, five of whom were children, and injuring more than two dozen others. Van Rootselaar, 18, was found dead after the shooting from what appeared to be a self-inflicted wound.

The middle school and high school building where a shooting took place in Tumbler Ridge.
Photographer: Eagle Vision Agency/AFP/Getty Images

In the wake of the shooting, OpenAI said it had banned Van Rootselaar last June for violating its ChatGPT usage policy. Her account was flagged at the time for messages deemed to have potential for violence, but OpenAI did not alert police. The Wall Street Journal first reported on OpenAI’s decision, saying concerned employees had urged the startup to report the situation to authorities.

Later in February, OpenAI revealed that the suspected killer had created a second ChatGPT account, which the company did not spot until her name was released by police. OpenAI told Canadian lawmakers that, under newly updated company rules, it would have referred Van Rootselaar to police.

Last week, Altman wrote in a letter published by Tumbler RidgeLines, a local news site, that he wanted to express his “deepest condolences to the entire community.”

“I am deeply sorry that we did not alert law enforcement to the account that was banned in June,” Altman wrote. “While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered.”

Read More: Canadian Province Says It Will Ban Social Media, AI For Youth

The lawsuits come at a sensitive time for OpenAI, which is eyeing a much-anticipated public offering that’s poised to be one of the largest in history as the company approaches a trillion-dollar valuation.

OpenAI is also trying to fend off claims by Elon Musk that it abandoned its founding mission as a nonprofit when it restructured last year as a for-profit entity. At a trial in California that started this week, Musk may ask a judge to order the conversion to be unwound.

(Updated with statement from OpenAI in fifth paragraph.)

--With assistance from Erik Larson and Thomas Seal.

To contact the reporter on this story:
Rachel Metz in San Francisco at rmetz17@bloomberg.net

To contact the editors responsible for this story:
Seth Fiegerman at sfiegerman@bloomberg.net

Peter Blumberg, Ben Bain

© 2026 Bloomberg L.P. All rights reserved. Used with permission.