FSU victim’s family sues OpenAI, alleges ChatGPT helped shooter
Tiru Chabba’s family sued OpenAI in Florida federal court on May 11, 2026, alleging ChatGPT provided weapon, tactical and peak-hours information used in the April 2025 FSU attack.
The family of Tiru Chabba, 45, filed a complaint in Florida federal court on May 11, 2026, alleging that OpenAI's ChatGPT provided information used to plan the April 2025 shooting at Florida State University.
The complaint names Phoenix Ikner as the alleged shooter and alleges he used ChatGPT to obtain details on which weapons cause the most casualties, tactical planning advice, and the hours when campus foot traffic would be highest. The filing says the model's responses supplied specifics used to select weapons and choose the timing of the attack.
The attack killed two people, including Chabba, a father, and injured six others.
Chabba's family seeks damages for his death and other harms, arguing that OpenAI had a responsibility to prevent its product from being used to facilitate violence. The complaint identifies specific responses it says influenced the shooter's decisions.
OpenAI denied wrongdoing, stating that ChatGPT provided factual information and did not encourage or promote illegal activity. The company maintains the responses were informational and that the tool should not be treated as having directed criminal conduct.
Legal experts note that courts have not established clear precedent on whether creators of large language models can be held liable when third parties use them to plan or carry out violence. The lawsuit is the second in the United States to allege that ChatGPT helped facilitate a mass shooting.
The case will proceed in federal court in Florida, where judges will weigh the arguments from the family and OpenAI. The outcome could shape how courts handle claims against AI companies whose models are used by third parties to commit violent acts.