- Two New York lawyers were sanctioned for using ChatGPT
- Existing standards provide base for discipline, judge says
Artificial intelligence poses ethical risks to lawyers who use such tools and could leave their practices worse off, bankruptcy judges said at a conference Wednesday.
Blaming AI products for errors won’t get lawyers very far with judges, Judge Laurel Isicoff of the US Bankruptcy Court for the Southern District of Florida said. If a lawyer chooses to use AI, that lawyer is responsible for any potential consequences, she said.
“Just because you’re using a new tool doesn’t mean you have excuses,” Isicoff said.
The judge made the comments during a panel as part of an event in Puerto Rico hosted by the American Bankruptcy Institute. The panel discussion was held as the legal industry grapples with the use of AI in the practice of law.
Lawyers also shouldn’t become overly dependent on AI, Isicoff added. AI tools might be great at generating questions for a deposition, for example, but lawyers should still react to what’s said during the deposition and come up with follow-up questions of their own, she said.
“Don’t let this make you intellectually lazy,” she said.
Two New York lawyers were sanctioned in June for submitting a ChatGPT-generated court brief that contained citations to non-existent cases. The New York sanctions case shows that lawyers should check AI-generated filings for accuracy, said Mildred Cabán Flores, chief bankruptcy judge for the US Bankruptcy Court for the District of Puerto Rico.
But the risks posed by AI don’t necessarily mean that courts should create new local rules to deal with them, Cabán said. Lawyers can already be sanctioned for AI use under the American Bar Association’s Model Rules of Professional Conduct, she added. Those rules cover AI because they address issues like technology use, accuracy of citations and more, she said.
If judges do create local rules, the question will be how far they go: whether they simply restrict ChatGPT and similar products, or extend to AI-powered tools from companies that provide legal information, she said.
As these tools continue to develop, the cost savings and efficiency benefits they provide will eventually make it “malpractice” not to use them, said fellow panelist Alberto Estrella, managing member and trial lawyer at Estrella LLC.
“Our clients are expecting that we use it,” he said.