The California Bar Exam Had an AI Ghostwriter

Key Takeaways

  • A contractor used OpenAI’s ChatGPT to help write 29 multiple-choice questions for the February California Bar Exam.
  • This information came to light in a petition filed with the California Supreme Court.
  • The Supreme Court’s justices were reportedly unaware of the AI use until the State Bar recently disclosed it.
  • The February exam already faced severe technical glitches, preventing some applicants from finishing.
  • Concerns are mounting about the validity of the AI-assisted questions and the exam’s overall fairness.
  • State officials and legal experts are demanding more transparency and accountability from the State Bar.

It turns out artificial intelligence played a role in crafting questions for the recent California Bar Exam.

A contractor hired by the State Bar, ACS Ventures Inc., used OpenAI’s ChatGPT to develop 29 of the 200 multiple-choice questions for the February test. This detail emerged in a petition filed late Tuesday with the California Supreme Court, according to Bloomberg Law.

The filing revealed that ACS Ventures created prompts designed to generate questions matching specific legal topics identified by State Bar staff, then fed these prompts into ChatGPT.
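For readers curious what such a workflow can look like in practice, here is a minimal, purely hypothetical sketch using OpenAI’s Python SDK. The State Bar has not published ACS’s actual prompts, model choice, or tooling; every topic and prompt below is an invented placeholder.

    # Purely illustrative: ACS's actual prompts, model, and pipeline are not public.
    # Assumes the openai Python SDK (v1.x) with OPENAI_API_KEY set in the environment.
    from openai import OpenAI

    client = OpenAI()

    # Placeholder topics standing in for the legal subjects Bar staff identified.
    topics = ["contracts: parol evidence rule", "evidence: hearsay exceptions"]

    for topic in topics:
        prompt = (
            f"Write one bar-exam-style multiple-choice question on {topic}. "
            "Give four options labeled A-D, identify the correct answer, "
            "and explain the answer in one paragraph."
        )
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder; the model ACS actually used is unknown
            messages=[{"role": "user", "content": prompt}],
        )
        print(response.choices[0].message.content)

Whatever pipeline ACS actually used, output like this would still need expert legal and psychometric review before appearing on an exam, which is presumably the stage at which some of its questions were later discarded.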

The Bar’s petition to the court was delayed after the justices, who hadn’t previously known about the AI’s involvement, requested an explanation.

This news adds another layer of complexity to an already troubled exam. Applicants faced significant technical problems during the February test, including repeated crashes that stopped many from completing it. Lawsuits and investigations followed those disruptions.

Now, the State Bar is asking the court to approve a scoring adjustment intended to bring pass rates in line with those of previous years. However, the revelation about AI-generated questions raises fresh doubts about the test’s validity.

Critics such as University of San Francisco law professor Katie Moran argue that the petition doesn’t adequately address how the AI was used or what safeguards were in place.

State Senator Tom Umberg expressed “sadness and great frustration” at the news. He plans to question Bar officials about who authorized the AI use and what lessons have been learned from the exam’s troubled rollout.

“This is a core responsibility of the State Bar, and the fact that they basically have failed in this core responsibility is absolutely unacceptable,” Umberg stated.

The petition explains that the Bar turned to ACS Ventures for question development when it became clear that other sources, including Kaplan NA LLC, wouldn’t provide enough questions covering all required subjects.

Interestingly, the petition notes that the decision to use ACS for question development wasn’t clearly communicated to State Bar leadership, and internal changes have been made to fix this communication gap.

While the Bar claims initial data suggest the exam measured competence adequately, 29 of the 200 questions were ultimately removed from the final scoring. Six of the removed questions came from ACS’s batch of 29 AI-assisted items, discarded after review.

The remaining 23 removed questions were 17 developed by Kaplan and six reused from a previous exam.

Exam takers like Zack Defazio-Farrell expressed surprise, noting the irony: “As attorneys, we can’t use AI to file anything in court. We could be disbarred for that. So, they can get away with it, but we can’t.”
