California Supreme Court demands State Bar answer questions on AI exam controversy

A man enters the State Bar of California building. (Jae C. Hong / Associated Press)

The California Supreme Court urged the State Bar of California on Thursday to explain how and why it used artificial intelligence to develop multiple-choice questions for its botched February bar exams.

California’s highest court, which oversees the State Bar, disclosed Tuesday that its justices were not informed before the exam that the State Bar had allowed its independent psychometrician to use AI to develop a small subset of questions.

The court on Thursday upped its public pressure on the State Bar, demanding it explain how it used AI to develop questions — and what actions it took to ensure the reliability of those questions.

The demand comes as the State Bar petitions the court to adjust test scores for hundreds of prospective California lawyers who complained of multiple technical problems and irregularities during the February exams.

The controversy is about more than the State Bar’s use of artificial intelligence per se. It’s about how the State Bar used AI to develop questions — and how rigorous its vetting process was — for a high-stakes exam that determines whether thousands of aspiring attorneys can practice law in California each year.

It also raises questions about how transparent State Bar officials were as they sought to ditch the National Conference of Bar Examiners’ Multistate Bar Examination — a system used by most states — and roll out a new hybrid model of in-person and remote testing in an effort to cut costs.

In a statement Thursday, the Supreme Court said it was seeking answers as to “how and why AI was used to draft, revise, or otherwise develop certain multiple-choice questions, efforts taken to ensure the reliability of the AI-assisted multiple-choice questions before they were administered, the reliability of the AI-assisted multiple-choice questions, whether any multiple-choice questions were removed from scoring because they were determined to be unreliable, and the reliability of the remaining multiple-choice questions used for scoring.”

Last year, the court approved the State Bar’s plan to forge an $8.25-million, five-year deal with Kaplan to create 200 test questions for a new exam. The State Bar also hired a separate company, Meazure Learning, to administer the exam.

It was not until this week — nearly two months after the exam — that the State Bar revealed in a news release that it had deviated from its plan to use Kaplan Exam Services to write all the multiple-choice questions.

In a presentation, the State Bar revealed that 100 of the 171 scored multiple-choice questions were written by Kaplan, 48 were drawn from a first-year law students’ exam, and the remaining 23 were developed with the assistance of artificial intelligence by ACS Ventures, the State Bar’s psychometrician.

“We have confidence in the validity of the [multiple-choice questions] to accurately and fairly assess the legal competence of test-takers,” Leah Wilson, the State Bar’s executive director, said in a statement.

Alex Chan, an attorney who chairs the Committee of Bar Examiners, which exercises oversight over the California Bar Examination, told The Times on Tuesday that only a small subset of questions used AI — and not necessarily to create the questions.

Chan also noted that the California Supreme Court urged the State Bar in October to review “the availability of any new technologies, such as artificial intelligence, that might innovate and improve upon the reliability and cost-effectiveness of such testing.”

“The court has given its guidance to consider the use of AI, and that’s exactly what we’re going to do,” Chan said.

That process, Chan later explained, would be subject to the court’s review and approval.

On Thursday, Chan revealed to The Times that State Bar officials had not told the Committee of Bar Examiners ahead of the exams that it planned to use AI.

“The Committee was never informed about the use of AI before the exam took place, so it could not have considered, much less endorsed, its use,” Chan said.

Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, said this raised a series of questions.

“Who at the State Bar directed ACS Ventures, a psychometric company with no background in writing bar exam questions, to author multiple-choice questions that would appear on the bar exam?” she said on LinkedIn. “What guidelines, if any, did the State Bar provide?”

Mary Basick, assistant dean of academic skills at UC Irvine Law School, said it was a big deal that the changes in how the State Bar drafted its questions were not approved by the Committee of Bar Examiners or the California Supreme Court.

“What they approved was a multiple-choice exam with Kaplan-drafted questions,” she said. “Kaplan is a bar prep company, so of course, has knowledge about the legal concepts being tested, the bar exam itself, how the questions should be structured. So the thinking was that it wouldn’t be a big change.”

Any major change that could impact how test-takers prepare for the exam, she noted, requires a two-year notice under California’s Business and Professions Code.

“Typically, these types of questions take years to develop to make sure they’re valid and reliable and there’s multiple steps of review,” Basick said. “There was simply not enough time to do that.”

Basick and other professors have also raised concerns that hiring a non-legally-trained psychometrician to develop questions with AI, as well as to determine whether those questions are valid and reliable, represents a conflict of interest.

The State Bar has disputed that idea: “The process to validate questions and test for reliability is not a subjective one, and the statistical parameters used by the psychometrician remain the same regardless of the source of the question,” it said in a statement.

On Tuesday, the State Bar told The Times that all questions were reviewed by content validation panels and subject matter experts ahead of the exam for factors including legal accuracy, minimum competence and potential bias.

When measured for reliability, the State Bar said, the combined scored multiple-choice questions from all sources — including AI — performed “above the psychometric target of 0.80.”

The State Bar has yet to answer questions about why it deviated from its plan for Kaplan to draft all the exam multiple-choice questions. It has also not elaborated on how ACS Ventures used AI to develop its questions.
