OpenAI CEO Sam Altman APOLOGIZES to Canadians for not alerting authorities over Tumbler Ridge trans mass killer’s account

OpenAI CEO Sam Altman has apologized to the people of Tumbler Ridge in British Columbia, Canada, after it was revealed that a trans mass shooter used ChatGPT to ask questions about carrying out mass shootings. OpenAI did not report the conversations to police, nor did it disclose that the account had been banned.
“I am deeply sorry that we did not alert law enforcement to the account that was banned in June,” Altman said. “While I know words can never be enough, I believe an apology is necessary to recognize the harm and (irreversible) loss your community has suffered.”
“I want to express my deepest condolences to the entire community,” said Altman, per Tumbler Ridge Lines. “No one should ever have to endure a tragedy like this. I cannot imagine anything worse in the world than losing a child.”
The family of Maya Gebala, who was injured in the attack while trying to protect other students, several of whom were killed, announced that they are suing the AI company for not reporting the account to the authorities when it first discovered the violent chat history.
Jesse Van Rootselaar killed his mother and half-brother at the family’s home on February 10 before moving on to the local high school and opening fire there. OpenAI later told the RCMP that Van Rootselaar’s ChatGPT account was banned due to “violent” activity.
Altman said he “reaffirm[s] the commitment [he] made to the Mayor and the Premier to find ways to prevent tragedies like this in the future. Going forward, our focus will continue to be on working with all levels of government to help ensure something like this never happens again.”
OpenAI is also facing a lawsuit in Florida after it was revealed that a man who opened fire at Florida State University discussed his plans with ChatGPT. “We have been advised that the shooter was in constant communication with ChatGPT leading up to the shooting,” said attorneys Ryan Hobbs and Dean LeBoeuf, who represent the victim’s family in that case. Over 270 AI photos and ChatGPT conversations are listed as exhibits in the case.
“We also have reason to believe that ChatGPT may have advised the shooter how to commit these heinous crimes. We will therefore file suit against ChatGPT, and its ownership structure, very soon, and will seek to hold them accountable for the untimely and senseless death of our client, Mr. Morales.”