- Large language models could be a tool for judges
- Reliability and court rules remain major concerns
Law clerks and interns for federal Judge Xavier Rodriguez recently spent weeks poring over evidence from a high-profile trial on challenges to Texas’ voting and election laws, and then summarized key testimony for the court’s official findings of fact and conclusions of law.
This summer, an AI tool is doing the same thing.
While only the human-powered work will become part of the court record, Rodriguez plans to publish results on how well and how quickly an artificial intelligence tool performed the summarization and analysis compared to trained young lawyers and law students, he told Bloomberg Law.
“I am not aware of other judges doing similar experiments,” Rodriguez said in an email, “but I would be surprised if they’re not.”
A member of the Texas AI legal task force, Rodriguez is among the federal and state judges dipping their toes into how the technology can be used in and around the courtroom. The questions they’re posing were highlighted on May 28 when Judge Kevin Newsom of the US Court of Appeals for the Eleventh Circuit wrote a concurring opinion in an otherwise mainstream insurance case floating the idea that generative AI large language models could be used to help determine the “ordinary meaning” of legal text.
“Here’s the proposal, which I suspect many will reflexively condemn as heresy, but which I promise to unpack if given the chance,” Newsom wrote in a concurrence that at 29 pages was longer than the opinion. “Those, like me, who believe that ‘ordinary meaning’ is the foundational rule for the evaluation of legal texts should consider—consider—whether and how AI-powered large language models like OpenAI’s ChatGPT, Google’s Gemini, and Anthropic’s Claude might—might—inform the interpretive analysis. There, having thought the unthinkable, I’ve said the unsayable.”
Newsom’s concurrence in a case about whether installation of an in-ground trampoline and retaining wall fit the common understanding of the term “landscaping” largely focused on the possibilities and pitfalls of AI. And it marked a “starting gun” for other judges who want to experiment with those parameters in the courtroom, said Dazza Greenwood, executive director of MIT’s Computational Law Report.
“This technology exists,” said Greenwood. “It can and should be used in some ways for this.”
‘Russian Roulette With the Law’
Others with expertise on AI and the law said technological and logistical concerns must be addressed before AI becomes a judicial tool.
A big concern, as Newsom noted, is that large language models are still prone to hallucination—that is, producing outputs that lack factual grounding.
“It’s playing Russian roulette with the law,” said Stanford professor Daniel Ho, who has conducted research on legal hallucinations in large language models and the viability of using similar models for judicial interpretation.
Additionally, Ho said, large language models and similar tools—such as corpus linguistics—could make hidden choices in judicial interpretation that wouldn’t be allowed by a judge if made expressly.
“For instance, judges who are reluctant to rely on legislative history might, through reliance on a corpus, in fact have 20% of their sources be forms of legislative history,” Ho explained.
‘Have to Have Human Eyes’
Ensuring that judges and clerks who use these tools understand how they work is key, said former US Magistrate Judge Ron Hedges. Because the technology has limitations, “you have to have human eyes somewhere to verify that what was done is real,” he said.
But judges seeking to use AI tools for judicial interpretation may face pushback from the involved parties, said Jay Brudz, an attorney at Faegre Drinker Biddle & Reath LLP who focuses on AI and data issues. Existing rules around evidence and judicial notice could complicate judges’ ability to use these tools in their research.
“There are already instances of other courts prohibiting, limiting or subjecting to disclosure a lawyer’s use of the same technology on behalf of a client,” said John Bonnie, a partner at Weinberg Wheeler Hudgins Gunn & Dial who leads the firm’s insurance coverage practice group.
‘Caution and Humility’
“In his most recent year-end report on the state of the federal judiciary, Chief Justice [John] Roberts cautioned that the ‘use of AI requires caution and humility,’” wrote Newsom. “I wholeheartedly agree. Importantly, though, I also agree with what I take to be the report’s assumption that AI is here to stay. Now, it seems to me, is the time to figure out how to use it profitably and responsibly.”
Several federal appeals courts have formed committees to examine the use of AI, and the Fifth Circuit proposed a rule regulating lawyers’ use of AI in November. The Fifth Circuit last month said it was declining to adopt the rule after some attorneys said current regulations already guard against false information that could be generated by AI tools.
Rodriguez said his use of AI is in keeping with the rules where he works, in US District Court for the Western District of Texas.
“Since I am not using the AI tool at all in drafting my FFCL, I have not sought any approval from the parties or any Administrative Office approval,” he wrote, referring to his official findings of fact and conclusions of law. “I did seek permission to use the AI tools from our district IT office to ensure that the AI product has adequate cybersecurity measures in place.”
Rodriguez noted he is also using a separate AI tool as part of another case, this one involving Title VII, “to assist me in locating relevant citations in the record.”
Courts are slow to adopt new technology for good reason, said Louisiana state appellate court Judge Scott Schlegel, who sits on the advisory council to the national bar’s AI task force. The bench needs both early adopters like him and judges who urge caution and try to pump the brakes, he said.
Schlegel said he applauds Newsom and his novel concurring opinion.
“Look what it’s done,” he said. “It’s given the entire world an opportunity to talk about it.”