AI Firm Sued Over Shooter's ChatGPT Account
11 Mar
Summary
- OpenAI account suspended before mass shooting.
- Lawsuit claims AI company ignored violent intentions.
- Victim's family seeks truth about the attack.

A lawsuit has been filed against OpenAI by the family of Maya Gebala, a student injured in a mass shooting in British Columbia. The family accuses the artificial intelligence company of negligence for not alerting authorities about disturbing content found in the shooter's ChatGPT account.
On February 10th, 18-year-old Jesse Van Rootselaar fatally shot her mother, her brother, five students, and an educator before taking her own life. Maya Gebala was critically wounded and remains hospitalized in Vancouver, where she has undergone multiple brain surgeries.
Eight months before the attack, OpenAI suspended an account associated with Van Rootselaar for violating its user agreements, citing her documented fascination with violence. The lawsuit alleges that OpenAI was aware of her violent intentions and that she had used its AI to plan scenarios, including mass casualty events.