- Vendor is promised up to $502,000 in 2024, 2025 contracts
- ACS Ventures contracts don’t explicitly consider AI use
The independent data science firm that used ChatGPT to develop multiple choice questions for the new California State Bar exam was promised more than half a million dollars for its work under contracts obtained by Bloomberg Law, which include strict privacy provisions and prohibit third-party outsourcing.
As the Bar’s longtime psychometrician comes under scrutiny for using AI to generate 29 questions for the February exam, the contracts between ACS Ventures and the California State Bar raise questions about whether its use of the OpenAI chatbot was prohibited.
The February Bar exam was marred by widespread failures that threw thousands of test takers’ careers into jeopardy, and the Bar’s disclosure that AI was used ignited further controversy. The Bar said last week it erroneously informed a few applicants they failed the exam when, in fact, they passed. And test takers and exam experts are still raising concerns about the vetting of the exam’s multiple choice questions.
The contracts don’t explicitly contemplate the use of AI.
They do require that ACS “will not assign, subcontract, delegate or otherwise transfer any of the rights, duties or obligations of this Agreement to any third party without the prior written consent of the State Bar and compliance with the requirements as set forth below.”
University of San Francisco Law Professor Nicole Phillips said she’s optimistic about using AI to create practice Bar questions to aid in studying, for example, but “if you’re not using an AI source that you own and that you control the privacy settings of, it’s definitely a third party, because they’re keeping every bit of information you’re entering.”
Using AI without clear safeguards on the February exam is “a real disappointment, because I think it’s going to scare a lot of people away from the efficient, effective use of it,” she said.
ACS is promised up to $502,000 across four work orders for 2024 and 2025. The work descriptions are mostly vague, sometimes labeled “ad hoc,” but generally contemplate the production of reports, analysis, and grading of the California Bar and first-year exams.
The contracts are specific when it comes to confidentiality. ACS was required to hold applicant data in “strictest confidence,” restricting who can access information and where.
For example, without written permission otherwise, the contract says ACS can only work with test takers’ information at a State Bar office or in co-founder Chad Buckendahl’s Las Vegas home.
“It seems like here, there were a couple steps missed from entering the prompt to printing the test,” Phillips said.
In response to a request for comment, the State Bar said the AI use “was inappropriate...without levels of transparency and policy-level decision making. Structural changes have been made within Admissions to ensure this will not happen again.”
It added that the contract with ACS “does not contemplate the creation of questions” but that ACS was asked to develop questions to ensure there were enough questions in all subtopic subject areas.
ACS didn’t respond to requests for comment.
Additional Services, TBD
As the Bar’s independent psychometrician, ACS is tasked with ensuring exam questions are valid and reliable and handling other data science issues.
Buckendahl, who has a PhD and a master’s in legal studies but isn’t a lawyer, has been working with the Bar since at least 2013.
Leading up to the February Bar exam, ACS agreed to do psychometric work and statistical analysis as well as “additional services, to be determined, as needed.”
When the Bar was in need of questions for its brand-new exam in fall 2024, admissions staff asked ACS to generate some material to fill in the gaps, according to the petition the Bar filed in April with the California Supreme Court. Bar leadership didn’t know about the request until later.
“ACS performed an initial edit on each draft item to ensure the questions had a standardized structure, but did not review for content accuracy, bias, or to determine whether the question was appropriate for entry-level attorneys,” the Bar’s petition said, adding the questions were reviewed instead by the Bar’s content validation panels.
The contracts are hazy on the exact details of ACS’ work for the February exam, said University of San Francisco Law Professor Katie Moran, who reviewed all the contracts.
One ACS contract, entered into in March 2025, gives the most detailed explanation of ACS’ role in data analysis for the February and July 2025 tests. But it again references a nebulous “exam development” task and says ACS will provide an analysis of the development of multiple choice questions by Kaplan NA LLC, which provided the majority of those questions in February.
It’s also the most expensive contract, outlining up to $204,000 in payments.
Moran has been sounding the alarm over the fact that ACS, which vets exam questions, also wrote dozens of its own.
The Bar said all the multiple choice questions performed well statistically, but ACS questions performed “less well than those drafted by” other providers on some measures. That relative performance “negates inferences of bias,” the Bar said.
All the Bar’s recent contracts with ACS contain conflict of interest provisions, barring ACS from having a business or financial stake in its work beyond receiving payment under the contract.
“I’d argue that if you’re writing your own questions and then approving them, that does create a conflict of interest,” Moran said.