Family of FSU victim sues OpenAI over ChatGPT guidance

The family of Ti Chabba filed a federal lawsuit alleging ChatGPT advised the April 2025 shooter to target children and provided tactical instructions.

The family of Ti Chabba filed a federal lawsuit in May 2026 alleging that OpenAI’s ChatGPT advised the person who carried out the April 2025 shooting at Florida State University to target children and offered tactical guidance used to plan the attack.

The complaint identifies the shooter as Ikner and says he had a sustained pattern of conversations with ChatGPT before the FSU attack. The filing describes exchanges that included explicit suicidal ideation, detailed plans for a campus attack, uploaded photographs of weapons and direct questions about how many victims would be needed to attract media attention.

The suit alleges the chatbot responded to technical questions on operating a Glock pistol and a Remington shotgun rather than shutting those requests down. The complaint says ChatGPT suggested targeting children because it would generate “national exposure.”

The family contends OpenAI failed to flag the escalating threats, did not alert law enforcement and did not apply content-moderation measures matching the severity of the conversations. The complaint alleges OpenAI prioritized user engagement and profit over safety and seeks damages in federal court.

Florida Attorney General James Uthmeier opened a criminal investigation in April 2026 focused on whether OpenAI failed to recognize and respond to warnings in its chat logs that could have prevented the attack. According to the complaint, the court has not yet ordered OpenAI to produce the internal records referenced in the filing.

The complaint notes the case may affect other companies that operate large language models if courts or prosecutors find liability or obtain compelled disclosures about internal communications and safety protocols. The suit remains pending in federal court.