- Applicants risk rejection for using AI in applications at some schools
- Prospective students seek official policies, survey says
Law school applicants want official policies with clear guidance on how they can use AI when applying for admission, according to a new survey from education company Kaplan.
In the survey of roughly 300 pre-law students, conducted in February and scheduled for release on March 18, 83% of respondents said law schools should have official policies clearly stating how generative AI may or may not be used in the admissions process.
But according to an October Kaplan survey of admissions officers, 54% of law schools had no official policy on AI in applications.
In the absence of official guidance from law school accrediting and advisory bodies, law schools are taking disparate approaches to AI use in admissions essays even as applications to law schools rose 20% year over year, according to the Law School Admission Council.
On the actual use of AI in the admissions process, most applicants opposed allowing generative AI tools like ChatGPT, the new survey found. Only 27% of the pre-law students surveyed agreed that applicants should be permitted to use AI to help write admissions essays.
“Without clear guidance from law schools, students are likely to assume using AI is completely acceptable,” Amit Schlesinger, Kaplan’s executive director of legal and government programs, said in a recent interview. “Law schools need to catch up and have some enforceable and transparent policies,” he said.
In 2023, Arizona State University Sandra Day O’Connor College of Law became the first to announce it would allow applicants to use AI in application essays, provided they disclose and certify that use, and other schools have since followed with similar policies.
Other schools, like Columbia University Law School, have gone in a different direction—strictly prohibiting AI in applications and threatening rescission of an offer or even disciplinary proceedings. And some schools, like the University of Pennsylvania Carey Law School, rely on policies requiring applicants to certify that the information they are providing in the application is their “sole original work,” without specifically mentioning AI.
“If a school hasn’t explicitly stated its rules, the safest approach is to ask directly,” Schlesinger said.
On the flip side of the issue—whether AI should be used to evaluate law school applicants—the survey was clear: would-be law students aren’t enthusiastic. According to the Kaplan survey, 75% of pre-law students said they’d feel more comfortable applying to a law school that doesn’t use AI in admissions, and 89% want law schools to at least disclose whether they’re using AI to evaluate applicants.
As for how AI plays into the process from the schools’ vantage point, Washington University in St. Louis School of Law allows the use of AI tools in preparing the essay but still takes a close look at applicants’ other writing samples, such as the LSAT writing sample, for evidence of strong writing skills, said Katherine Scannell, vice dean for institutional success. Applicants may be asked to submit further writing samples if admissions officers have doubts, she said.
Kristy Lamb, assistant dean of preprofessional advising at New York University, said she tells students: “Generative AI may change some aspects of the legal profession, but clear and quick communication will be critical as a law student and lawyer. Let law schools see your potential in this area by showcasing your own work in your application.”