Senator Grassley Asks Federal Judges If AI Used for Rulings

Oct. 6, 2025, 6:24 PM UTC; Updated: Oct. 6, 2025, 8:59 PM UTC

Two federal judges are facing congressional inquiries for issuing orders that contained made-up citations and drove speculation they used AI tools.

Senate Judiciary Committee Chairman Chuck Grassley (R-Iowa) on Monday sent letters to New Jersey District Judge Julien Neals and Mississippi District Judge Henry Wingate, asking whether they or their staff entered confidential information into generative artificial intelligence tools to produce rulings. Grassley’s letters, which a representative for the senator emailed to Bloomberg Law, ramp up the scrutiny both judges face after they issued orders over the summer that contained several made-up elements.

“No less than the attorneys who appear before them, judges must be held to the highest standards of integrity, candor, and factual accuracy,” Grassley said in the letters. “Indeed, Article III judges should be held to a higher standard, given the binding force of their rulings on the rights and obligations of litigants before them.”

The judges, who have until Oct. 13 to respond to the senator, didn’t immediately return requests for comment Monday. The US Constitution allows members of Congress to remove judges after impeachment by the House of Representatives and conviction by the Senate. A Grassley representative said the senator didn’t want to pre-empt what might come out of the inquiry.

Grassley’s letters come after both judges rescinded and replaced rulings that lawyers in the cases flagged as problematic.

Judge Neals, who was nominated by President Joe Biden in 2021, has overseen notable cases such as a lawsuit brought by several states accusing Apple of anticompetitive behavior. Judge Wingate, nominated by President Ronald Reagan in 1985, was responsible for the nearly 20-year sentence handed down to Chris Epps, the former Mississippi Department of Corrections Commissioner, in 2017 on bribery charges.

Entered “In Error”

Judge Neals on June 30 issued an order that contained case quotations that didn’t exist, according to a Willkie Farr & Gallagher defense litigator who flagged the mistakes. The court on July 23 said the order was entered “in error” and issued a new order in August, which had the same effect of denying the defendants’ motion to dismiss in a case involving a biopharma company sued by its shareholders.

Meanwhile, Judge Wingate rescinded and replaced a July 20 order that referred to parties, allegations, and quotes unconnected to the case, which was brought by a group challenging a ban on teaching concepts related to diversity, equity, and inclusion. Among other made-up elements, most of the plaintiffs Judge Wingate named in the original temporary restraining order are not parties in the case, an error that prompted lawyers to request clarification.

Judge Wingate in August denied the lawyers’ motion for clarification.

Grassley’s letters elevate speculation that the judges used generative AI from social media chatter to the level of congressional scrutiny. The senator asked the judges whether they or their staff used artificial intelligence and whether they entered any privileged information into an AI tool.

“These events prompted public concern that generative artificial intelligence (“AI”) may have been used in preparing the order with little or no human verification,” Grassley wrote in the letter to Judge Wingate.

In recent years, and in multiple instances this year, lawyers have borne the brunt of criticism over AI use in litigation proceedings.

As recently as June, the Court of Appeals for the Fifth District of Texas imposed a $2,500 sanction against an attorney who filed a brief citing multiple nonexistent cases. In April, a Colorado federal judge excoriated lawyers for MyPillow CEO Mike Lindell for submitting a brief that referred to cases that didn’t exist.

Cautionary Tale

Judge Neals and Judge Wingate are unlikely to experience much more than embarrassment from the issuance of the fake citations, which will serve as a cautionary tale to members of the legal system, said Professor Stephen Gillers, Elihu Root Professor of Law Emeritus at New York University School of Law.

“I think the judges will apologize and other judges will say there but for the grace of God go I,” Gillers said. “They’ll double down on the accuracy of their citations.”

He said lawyers and judges will need to learn to work alongside generative AI tools because of how prevalent they have become among lawyers and alternative legal service providers. AI, he said, “is going to be developed for the practice of law and judicial opinions as it gets more sophisticated.”

“It’s inevitable that it will bring structural change to how law firms and judges behave,” he said.

To contact the reporter on this story: Justin Henry in Washington DC at jhenry@bloombergindustry.com

To contact the editor responsible for this story: Alessandra Rafferty at arafferty@bloombergindustry.com
