OpenAI Hit With Overdose Suit Targeting ChatGPT Drug Advice

May 12, 2026, 5:19 PM UTC; updated May 12, 2026, 7:01 PM UTC

The family of a college student said ChatGPT caused their son’s fatal overdose after he followed medical advice about mixing substances from the chatbot, according to a newly filed lawsuit against OpenAI Foundation and Sam Altman.

On the day of the overdose, the chatbot “actively recommended” a mixture of Xanax and kratom, according to the complaint filed Tuesday in the Superior Court of California for the County of San Francisco. The chatbot also allegedly suggested that Samuel Nelson, the student, could add Benadryl to achieve the effect he wanted.

“If ChatGPT had been a person, it would be behind bars today. Sam trusted ChatGPT, but it not only gave him false information; it ignored the increasing risk he faced and did not actively encourage him to seek help,” Leila Turner-Scott, Sam’s mother, said in a statement.

The lawsuit is the latest in a string of litigation over real-world harms linked to chatbot use. Earlier lawsuits focused on harmful mental health effects caused by frequent chatbot interactions. More recent lawsuits have addressed the technology’s role in helping people plan and carry out mass shootings.

“This is a heartbreaking situation, and our thoughts are with the family,” a spokesperson for OpenAI said in an emailed statement about the overdose lawsuit. “These interactions took place on an earlier version of ChatGPT that is no longer available. ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts.”

The family is represented by the Social Media Victims Law Center and the Tech Justice Law Project. Both firms have previously brought lawsuits against chatbot makers for fostering delusions and encouraging self-harm in users.

According to the complaint, Sam had started using ChatGPT as a productivity tool in 2023 to troubleshoot computer problems, help with homework, and learn about the latest celebrity gossip.

But as he increasingly relied on and trusted the tool, Sam began asking questions about substance use, the filing said.

At first, ChatGPT refused to answer those questions or advise him on engaging in illegal or dangerous behaviors, but an updated model began coaching him through illicit substance use in 2024, the complaint alleged.

The ChatGPT-4o model regularly “assisted him to choose his next drug” and “made personalized suggestions based on the experience Sam indicated that he wanted,” the complaint said. “The model inserted emojis in its responses to Sam, asked whether it could create playlists for him to set his mood, and began pushing increasingly dangerous amounts and combinations of drugs to Sam.”

“Providing the granular, personally-tailored level of medical advice” depicted in the filing “ran contrary to OpenAI’s own safety metrics by which the company assesses its models, yet this regularly occurred through Sam’s ChatGPT use,” the complaint said. By making dosing recommendations, ChatGPT “engaged in the unlicensed practice of medicine,” the complaint added.

The complaint brings defective design, failure-to-warn, negligence, and wrongful death claims. It also alleges a claim under California’s Unfair Competition Law and one under a state law provision prohibiting AI from representing itself as a licensed health practitioner.

The family is asking for damages as well as an injunction requiring OpenAI to pause healthcare-related products.

The Tech Accountability & Competition Project also represents the plaintiffs.

The case is Turner-Scott v. OpenAI Found., Cal. Super. Ct., complaint filed 5/12/26.

To contact the reporter on this story: Shweta Watwe in Washington at swatwe@bloombergindustry.com

To contact the editor responsible for this story: Kiera Geraghty at kgeraghty@bloombergindustry.com
