California Bar Says Contractor Used ChatGPT for Exam Prompts

April 30, 2025, 6:24 PM UTC; Updated: May 1, 2025, 4:52 PM UTC

A California State Bar test consultant used OpenAI’s ChatGPT to develop 29 of the February Bar Exam’s 200 multiple-choice questions, according to a petition filed late Tuesday with the state’s Supreme Court.

ACS Ventures Inc., which the State Bar contracts for test analysis and scoring, “drafted prompts to yield multiple-choice questions that aligned with the topic areas identified by Admissions’ staff and ran the prompts through OpenAI ChatGPT,” the petition said.

The Bar filed its petition with the Supreme Court a day later than planned because the justices asked for an explanation of AI’s role in exam development. The justices said they didn’t know AI had been used until the Bar mentioned it in an April 21 news release. The Bar is still pressing the court to approve its scoring methods so exam results can be released around May 2.

Applicants are still reeling months after the February exam—developed specifically for California to save money for the Bar’s admissions fund—glitched and crashed repeatedly, preventing many from finishing the exam and prompting litigation and investigations. The State Bar wants the justices to approve a scoring adjustment that it expects would bring pass rates in line with prior exams, but concerns about the validity of the questions themselves have cast doubt on the Bar’s efforts.

University of San Francisco School of Law Professor Katie Moran said in an email that the petition leaves the court’s questions about AI use unanswered.

“The petition does not say what steps the drafter and the state bar took to protect examinees,” Moran said. “It does not say whether the state bar assured steps were taken to prevent copyright infringement issues with NCBE materials when using ChatGPT. It does not say why open source AI was used as opposed to a proprietary AI source. It does not say whether anyone at the State Bar was aware that AI was used.”

State Sen. Tom Umberg (D), who has proposed launching an audit of the exam, said in an interview his reaction to the new petition is “one of sadness and one of great frustration.”

Umberg, who has called for a July return to the previous Bar Exam format using National Conference of Bar Examiners materials, said he’ll press during a Tuesday hearing with Bar officials for details on who is responsible for the use of AI, alongside questions about what happened during the exam’s rollout and what lessons the State Bar learned.

“This is a core responsibility of the State Bar, and the fact that they basically have failed in this core responsibility is absolutely unacceptable,” Umberg said.

The State Bar, ACS Ventures, and the Committee of Bar Examiners didn’t immediately respond to requests for comment.

Sourcing Exam Questions

The original plan was to source questions only from Kaplan NA LLC, with which the Bar had recently approved an $8.25 million, five-year contract, and from the “baby bar,” according to the petition.

In October, it became clear those two sources wouldn’t develop enough multiple-choice questions to cover all the required subject areas, the Bar wrote.

“Our agreement all along was for a subset of multiple choice questions,” Kaplan spokesperson Russell Schaffer said in an email. “Regarding question accuracy, the CA State Bar commissioned an extensive independent review of the validity and reliability of our questions, and our questions were found to perform well above the expected threshold.”

Initial data suggests “that the examination appropriately measured minimum competence and that February 2025 test takers performed in alignment with or better than prior February bar examination administrations,” the petition said.

Six questions that an expert determined had “potential accuracy concerns” were still posed to applicants as part of the exam because they had already been uploaded to the Meazure Learning testing platform, a footnote said. They were removed from scoring.

The State Bar expected to remove around 25 multiple-choice questions from scoring, as was customary when it used the Multistate Bar Exam, the petition said. Ultimately, 29 questions were left out of grading.

After analyzing reliability, reviewers tossed six of the 29 AI-assisted ACS questions, the petition said. By comparison, 17 of the 117 questions developed by Kaplan and six of the 54 questions recycled from the first-year law students’ exam, also known as the “baby bar,” were discarded.

ACS was first pulled into the question development process in September 2024, when Bar admissions staff asked the contractor to develop just over a quarter of the questions used in the November experimental multiple-choice test, the petition said.

“The decisions by Admissions staff to request that ACS develop questions for the November bar examination study and for use on the February 2025 bar examination were not clearly communicated to State Bar leadership,” the petition said in a footnote. “Structural changes within Admissions have been made to address this issue.”

Of the 14 questions ACS drafted for the November test, 11 were pulled for use on the February exam because they were deemed top-performing, the petition said.

“I’m not saying that you can’t, in theory, use AI, right? Advanced AI probably could write exams,” said Zack Defazio-Farrell, a February exam taker. “I just think that if you’re going to do it, you should be telling us ahead of time.”

He added, “As attorneys, we can’t use AI to file anything in court. We could be disbarred for that. So, they can get away with it, but we can’t.”

To contact the reporter on this story: Maia Spoto in Los Angeles at mspoto@bloombergindustry.com

To contact the editor responsible for this story: Stephanie Gleason at sgleason@bloombergindustry.com
