
OpenAI Sued in US Court by Family of Florida University Shooting Victim
Lawsuit alleges ChatGPT helped the gunman plan the attack and failed to flag violent conversations to authorities.
The family of a man killed in the 2025 mass shooting at Florida State University has filed a lawsuit against OpenAI in a US federal court, alleging that the gunman used ChatGPT to help plan the attack.
The family of Tiru Chabba filed the suit on Sunday in a federal court in Florida against the company and Phoenix Ikner, the man accused of carrying out the shooting. It is believed to be at least the second lawsuit in the United States accusing OpenAI of facilitating a mass shooting.
According to the complaint, ChatGPT acted as a “co-conspirator” in the attack because Ikner allegedly used information provided by the chatbot over several months to plan and carry out the shooting. The lawsuit claims the conversations included discussions about mass shootings, the lethality of weapons and the busiest times at the Florida State University student union, yet the chatbot neither flagged nor escalated the interactions.
The family is seeking compensatory and punitive damages, accusing OpenAI of designing a defective product and failing to warn the public about potential risks associated with its AI systems.
An OpenAI spokesperson, Drew Pusateri, rejected the allegations, saying the company was not responsible for the attack. He said ChatGPT provided factual responses based on information widely available on the internet and did not encourage or promote illegal or harmful activity.
Pusateri added that the company identified an account believed to be linked to the suspect after the shooting and proactively shared the information with law enforcement. He said OpenAI continues to cooperate with investigators and is working to improve its systems for detecting harmful intent.
Authorities said Ikner, the son of a deputy sheriff, killed two people and injured four others at the university in Tallahassee, Florida, before being shot by police officers and taken to hospital. Court records show he faces two counts of first-degree murder and seven counts of attempted first-degree murder.
A lawyer representing Ikner did not immediately respond to requests for comment.
In April, Florida Attorney General James Uthmeier announced a criminal investigation into ChatGPT’s alleged role in the shooting after prosecutors reviewed chat logs between Ikner and the chatbot.
OpenAI has previously said it trains its AI models to refuse requests that could “meaningfully enable violence” and alerts law enforcement when conversations indicate “an imminent and credible risk of harm”, with mental health experts involved in assessing borderline cases.
The company is among several AI firms facing a growing number of lawsuits alleging that chatbot interactions contributed to self-harm, mental illness and violent acts. Last month, relatives of victims of one of Canada’s deadliest mass shootings also filed lawsuits against OpenAI and its chief executive, Sam Altman, claiming the company knew months in advance that the suspect was allegedly planning the attack using ChatGPT but failed to alert authorities.