
A family affected by the devastating shooting in Tumbler Ridge, British Columbia, has filed a lawsuit against OpenAI, alleging the company failed to alert authorities despite warning signs in conversations with its chatbot, ChatGPT.
The case was filed Monday in B.C. Supreme Court by the mother of Maya Gebala, a 12-year-old who was critically injured during the Feb. 10 attack. The lawsuit claims the tech company ignored alarming prompts from the shooter that suggested plans for a mass casualty event.
None of the allegations have been proven in court, and OpenAI has not yet publicly responded to the claims.
Victim Still Recovering From Severe Injuries
According to the lawsuit, Maya Gebala was shot three times during the attack. Eight people were killed in the shooting, and the 18-year-old gunman also died.
The court filing says Maya suffered a catastrophic brain injury that has left her with permanent physical and cognitive disabilities. She remains hospitalized.
Her younger sister is also listed as a plaintiff. The claim says she experienced severe psychological trauma after being placed in a hold-and-secure situation during the attack.
The lawsuit states that the girl now struggles with post-traumatic stress disorder, anxiety, and depression. Their mother, Cia Edmonds, is also named as a plaintiff, citing the emotional and mental toll of the tragedy.
Allegations About ChatGPT Conversations
The lawsuit alleges the shooter, Jesse Van Rootselaar, interacted extensively with ChatGPT before the attack.
According to the claim, those conversations suggested long-term planning for a mass casualty event. The lawsuit argues that OpenAI either knew, or should have known, that the user posed a serious risk.
The filing claims the chatbot effectively acted as a trusted confidant for the shooter.
It describes the system as serving roles similar to a counsellor, therapist, and emotional support figure.
The lawsuit further alleges that OpenAI designed ChatGPT in ways that could foster psychological dependence by displaying human-like empathy and affirmation.
Claims About Age and Safety Measures
The claim also states that the shooter began using ChatGPT before turning 18.
Although OpenAI requires users between 13 and 18 to have parental consent, the lawsuit alleges that the company failed to implement meaningful age verification systems.
According to the plaintiffs, the company prioritized engagement and growth over safety safeguards.
The lawsuit also claims OpenAI internally suspended the shooter’s account but did not notify law enforcement about the potential threat.
A man lays flowers at a memorial to the victims of the Tumbler Ridge Secondary School shooting in Tumbler Ridge, British Columbia, on Thursday, Feb. 12, 2026.
Political Pressure and Government Meetings
The case has already drawn political attention.
OpenAI CEO Sam Altman recently held a virtual meeting with British Columbia Premier David Eby and Tumbler Ridge Mayor Darryl Krakowka.
After the meeting, Eby said Altman agreed to apologize to the Tumbler Ridge community and work with the provincial government on recommendations for AI regulation.
Federal Artificial Intelligence Minister Evan Solomon also met with Altman separately.
Solomon said OpenAI has promised to cooperate more closely with Canadian authorities in the future.
Changes to Company Policy
Officials say OpenAI has since revised internal policies following the shooting.
The company reportedly lowered the threshold for notifying law enforcement when conversations suggest credible threats.
Meanwhile, authorities have announced a coroner’s inquest into the shooting. The investigation will examine several factors, including whether artificial intelligence played any role in the tragedy.
No date has yet been set for the inquest.

