OpenAI Accused of Pushing Stalker’s Delusion Through ChatGPT

April 10, 2026, 7:57 PM UTC; Updated: April 10, 2026, 9:21 PM UTC

OpenAI Foundation was hit with a new lawsuit from a woman who says ChatGPT encouraged her ex-boyfriend’s delusions and helped him create materials to stalk and humiliate her.

The user allegedly relied on GPT-4o to create false clinical-style reports that portrayed the woman as psychologically defective, abusive, and dangerous. The stalker then spread those materials to her friends, family, colleagues, and clients, according to the anonymous complaint filed Thursday in California Superior Court.

“Because GPT-4o enabled him to produce lengthy, authoritative-seeming documents at a volume and speed that would not otherwise have been possible, the harassment was qualitatively different from ordinary harassment and far more difficult to contain,” the complaint said.

“We are reviewing the plaintiff’s filing to understand the details, and with current information, we’ve identified and suspended relevant user accounts,” an OpenAI spokesperson said in a statement. “We have continued to improve ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We have also continued to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”

The complaint is part of a cluster of cases in which courts are being asked to examine whether interactions with ChatGPT are inflaming real-world violence.

Florida Attorney General James Uthmeier (R) announced Thursday that his office would subpoena OpenAI after a gunman’s alleged use of ChatGPT led him to carry out a mass shooting at Florida State University. The company is also facing a lawsuit stemming from a murder-suicide whose perpetrator had extensive conversations with ChatGPT.

The California lawsuit says OpenAI knew the woman’s stalker was dangerous but failed to warn anyone named in his chat logs or suspend his access to ChatGPT.

Jane Doe said OpenAI previously deactivated the man’s account after flagging him for “Mass Casualty Weapons” activity. The next day a human safety team reviewed the account’s activity—which included a conversation titled “Violence list expansion” and chat logs naming specific individuals he was targeting—before restoring the account.

Doe additionally said she submitted a notice of abuse that OpenAI acknowledged was “extremely serious and troubling,” but the company took no follow-up action and did not restrict the user’s account.

The user eventually was arrested and charged with four felony counts of communicating bomb threats and assault with a deadly weapon. He is set to be released, the complaint said.

“Before his arrest, ChatGPT was exacerbating his delusions and facilitating his violent planning,” the complaint said. “When he regains access to ChatGPT that dynamic will continue and will further fuel his paranoia and materially increase the risk of harm.”

The complaint brings claims for negligence, design defect, failure to warn, and a violation of California’s Unfair Competition Law.

Doe is asking for punitive damages as well as an injunction requiring OpenAI to stop providing therapy through ChatGPT, prohibit the generation and dissemination of diagnostic-style psychological analyses of identifiable individuals, and implement safeguards against reinforcing delusional beliefs.

Doe is represented by Edelson PC.

The case is Doe v. OpenAI Found., Cal. Super. Ct., No. CGC26635725, complaint filed 4/9/26.

To contact the reporter on this story: Shweta Watwe in Washington at swatwe@bloombergindustry.com

To contact the editors responsible for this story: Laura D. Francis at lfrancis@bloombergindustry.com; Amy Lee Rosen at arosen@bloombergindustry.com
