Large language model chatbots are increasingly the subject of litigation over alleged harms to users. Google was sued last month because its chatbot, Gemini, allegedly encouraged a user to commit a mass shooting and then suicide. The suit comes on the heels of similar lawsuits against OpenAI and academic findings that other chatbots, such as Claude, may struggle with intermediate-risk suicidal inquiries or more nuanced health questions.
Broader Risk Potential
As chatbots become widely adopted, more cases may emerge of users engaging in acts of self-harm or violence after interacting with a chatbot. But will it end with one-off wrongful death suits? Trends in multidistrict litigation, including In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation and In re: National Prescription Opiate Litigation, demonstrate such suits may be an early warning.
Although the Gavalas suit ultimately concerns a tragic case of an individual taking his own life, the pleadings suggest the risk of broader harm to the public. According to the plaintiff, interactions with Gemini led the user to stage a mass attack in Miami. The user committed suicide after allegedly being advised by Gemini that it was the only way the two of them could achieve an “unbreakable connection.” Had such a mass casualty event occurred, mass tort, class action, or consolidated litigation would’ve been likely.
Shifting Litigation Strategy
In 2024, there were 16,725 gun-related deaths, 503 mass shootings, and 667 murder-suicides in the US. With increased adoption of LLM chatbots in 2026, future tragedies such as these could involve users communicating with chatbots before committing violence.
As Google searches are replaced with ChatGPT, investigators are looking to chatbot interactions to build a meaningful forensic trail. In February, for example, law enforcement discovered that an alleged murderer asked ChatGPT for advice on how to conceal the murder of his girlfriend.
The typical class actions against frontier AI companies involve copyright and data privacy matters. By contrast, claims that chatbots encouraged self-harm or gun violence have typically been litigated as individual wrongful death suits. The question is whether that will hold going forward.
Because gun manufacturers enjoy broad immunity from liability for crimes committed with their products under the Protection of Lawful Commerce in Arms Act, victims of such crimes need to look elsewhere for compensation. Plaintiff strategy for violence related to chatbot interactions may shift toward class actions, multidistrict litigation, and suits brought on behalf of municipalities against frontier AI companies.
Opioid Suit Model
In re: National Prescription Opiate Litigation is a reminder of how these trends can develop. In the late 1990s, the use of opioids to treat pain was reaching wide acceptance and, by 2012, prescribing rates peaked at approximately 81.3 prescriptions per 100 persons. Even prescient prosecutors’ offices, such as New York’s Office of the Special Narcotics Prosecutor, targeted pill mills, drug trafficking rings, and corrupt medical practitioners, rather than manufacturers and distributors.
Plaintiffs who attempted consolidated product liability actions faced appellate decisions decimating the size of their class. In 2012, it seemed opioids were an accepted feature of modern medicine: Any fault rested with the doctors for prescribing poorly, and procedural hurdles stood in the way of successful class litigation. It would’ve seemed unlikely that MDLs focused on public nuisance claims would be successful against opioid manufacturers and distributors.
But the trend shifted by 2017, when 64 cases were consolidated. Since then, more than 3,000 cases have been brought by states, local governments, Native American tribes, and other entities against opioid manufacturers, distributors, and pharmacies, often alleging public nuisance to sidestep evidentiary issues associated with product liability claims. Since 2018, defendants in those actions, including Purdue Pharma, Mallinckrodt, Endo International, Rite Aid, and Insys, have all filed for bankruptcy.
In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation demonstrates that opioids aren’t an outlier. In 2012, social media was widely adopted and beloved by investors. Facebook (now Meta Platforms) launched its initial public offering, one of the largest IPOs in history. But by 2016, academic literature began to raise concerns about social media addiction among adolescents. A 2021 whistleblower campaign intensified concerns. By 2022, a multidistrict litigation was formed in the Northern District of California. As of April 1, there were 2,634 cases brought in the MDL related to social media addiction.
Today, leadership at frontier AI companies may believe that their companies are protected by legal armor. Immunity under Section 230, proximate causation, and other defenses represent significant hurdles to potential plaintiffs. The federal government also has declared AI “transformative” and “a national security imperative for the US to achieve and maintain unquestioned and unchallenged global technological dominance.”
The federal government was similarly bullish about opioids. In 2013, the Drug Enforcement Administration approved a record quota of 153,750,000 grams of oxycodone for sale, before reversing course only a few years later and implementing a sharp reduction in approved quotas. Defenses that opioid defendants believed were strong, including federal preemption of state tort claims, ultimately gained little traction.
AI companies should take a close look at the trajectory of opioid litigation. An industry that appeared favored by government and capable of managing its liability became embroiled in public controversy and extensive litigation. Over the course of a few years, major companies were driven to insolvency.
Suicide and violence existed before the advent of chatbots. But drug overdoses also existed before oxycodone was synthesized. As with the harm and deaths from opioid addiction, there will likely be little sympathy among the American public for AI chatbots that advocate violence and self-harm. And as with the opioid industry, a change in political sentiment and significant legal ramifications may follow.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law, Bloomberg Tax, and Bloomberg Government, or its owners.
Author Information
Hayden A. Miller is a partner in Brown Rudnick’s litigation and dispute resolution practice group.
