- ChatGPT created fabricated embezzlement lawsuit
- Output falsely said Georgia man was being sued in real case
OpenAI LLC is facing a defamation lawsuit from a Georgia radio host who claimed the viral artificial intelligence program ChatGPT generated a false legal complaint accusing him of embezzling money.
The first-of-its-kind case comes as generative AI programs face heightened scrutiny over their ability to spread misinformation and “hallucinate” false outputs, including fake legal precedent.
Mark Walters said in his Georgia state court suit that the chatbot provided the false complaint to Fred Riehl, the editor-in-chief of the gun publication AmmoLand, who was reporting on a real-life legal case playing out in Washington state.
Riehl asked ChatGPT to provide a summary of Second Amendment Foundation v. Ferguson, a case in Washington federal court accusing the state’s Attorney General Bob Ferguson of abusing his power by chilling the activities of the gun rights foundation.
However, ChatGPT allegedly provided a summary of the case to Riehl that said the Second Amendment Foundation’s founder Alan Gottlieb was suing Walters for “defrauding and embezzling funds” from the foundation as chief financial officer and treasurer.
“Every statement of fact in the summary pertaining to Walters is false,” according to the defamation suit, filed on June 5.
OpenAI didn’t immediately return a request for comment.
Walters, the host of Armed America Radio, isn’t a party to the Ferguson case and has never been employed by the Second Amendment Foundation, the lawsuit said. The Second Amendment Foundation’s case “has nothing at all to do with financial accounting claims against anyone.”
The truth and reliability of AI chatbot outputs have sparked numerous controversies recently, as researchers and users uncover hallucinations—confident chatbot responses that are untrue.
An Australian mayor made headlines in April when he said he was preparing to sue OpenAI over ChatGPT outputs falsely claiming that he was imprisoned for bribery. A New York lawyer who used ChatGPT to draft legal briefs could face sanctions after he cited case law that never existed.
Riehl asked ChatGPT to provide the entire text of the Second Amendment Foundation’s complaint, and the chatbot allegedly generated “a complete fabrication” that “bears no resemblance to the actual complaint, including an erroneous case number.”
“ChatGPT’s allegations concerning Walters were false and malicious, expressed in print, writing, pictures, or signs, tending to injure Walters’ reputation and exposing him to public hatred, contempt, or ridicule,” the lawsuit said.
John Monroe Law PC represents Walters.
The case is Walters v. OpenAI LLC, Ga. Super. Ct., No. 23-A-04860-2, complaint filed 6/5/23.