A Dallas lawyer joins the growing ranks of attorneys and litigants to be sanctioned over alleged misuse of generative artificial intelligence—but not for fake case citations.
Javier Perez violated a Northern District of Texas local civil rule that mandates the disclosure of the use of generative AI programs, and Federal Rule of Civil Procedure 11, said Judge Ed Kinkeade of the US District Court for the Northern District of Texas in an opinion docketed Thursday.
“Even though artificial intelligence is a rapidly evolving technology, there have been enough repeated incidents of attorneys’ misuse of the technology that the pitfalls associated with using artificial intelligence are well-established,” Kinkeade wrote.
Perez admitted to using a popular AI chatbot to prepare portions of his client’s sworn declaration as part of responding to a summary judgment motion, the court said.
Kinkeade sanctioned Perez by requiring that he reimburse KIPP Texas, a public charter school network, for the attorneys’ fees and associated costs it incurred through its counsel’s preparation of the summary judgment reply. Perez also must complete two hours of continuing legal education on AI by the end of February.
Perez isn’t named in the opinion. But he’s the only attorney listed as representing the plaintiff, Dr. Joy Wilson, in the court docket and in Wilson’s Sept. 18 filing titled “Plaintiff’s Notice Regarding Use of Generative Artificial Intelligence.”
Perez didn’t immediately respond to requests for comment.
In support of her summary judgment response, Wilson submitted an appendix that serves as her sworn declaration, Kinkeade said. In the declaration, she attempts to show similarities between her former job and the new role at issue in the case, the judge said.
Yet, “Plaintiff’s counsel admits that he used ChatGPT, a generative artificial intelligence program, in preparing portions of the Declaration, which was sworn under penalty of perjury,” Kinkeade said.
Perez’s misuse of generative AI is different from the so-called “hallucinations,” or factual inaccuracies, that scores of lawyers and pro se litigants have been accused of submitting in recent weeks and months. In just two recent examples, federal courts in Alabama and California in mid-October sanctioned attorneys in separate cases, fining them thousands of dollars for including nonexistent, AI-generated legal citations in their filings.
Perez violated Local Civil Rule 7.2 because, by failing to disclose he used ChatGPT, “Plaintiff’s counsel certified to the Court that he did not use generative artificial intelligence in preparing the briefing. However, that was not accurate,” Kinkeade said.
At the same time, the judge said that Perez’s use of generative artificial intelligence violated the requirements of Federal Rule of Civil Procedure 11 “because the artificially generated factual contentions lack any evidentiary support.”
While not a fake case citation scenario, he said, “this is still a case of misrepresenting facts to a court by using artificial intelligence.”
Perez “used artificially generated ‘quotations’ in his client’s sworn declaration,” Kinkeade wrote.
KIPP Texas lead counsel Paul Lamp didn’t immediately respond to a request for comment.
Spalding Nichols Lamp Langlois and Littler Mendelson PC represent KIPP Texas.
The case is Wilson v. KIPP Texas Inc., 2025 BL 389935, N.D. Tex., No. 3:24-cv-02578, docketed 10/30/25.