Bloomberg Law

Law Firms Turn to AI to Vet Recruits, Despite Bias Concerns

Feb. 24, 2022, 11:00 AM

New York’s Cadwalader passed over a law student vying for a summer job until an artificial intelligence algorithm flagged her as a good match.

“For whatever reason, they just didn’t evaluate her that strongly when she was interviewing,” said Pat Quinn, chairman of Cadwalader, Wickersham & Taft. “Yet, she clearly has the goods.”

Law firms struggling to expand candidate pools and diversify workforces are turning to AI for help, even as regulators scrutinize the technology to ensure it doesn’t exacerbate biases rather than lessen them. A law set to take effect in New York City next year will limit the use of the technology in hiring and require that employers test recruiting algorithms for bias, while the U.S. Equal Employment Opportunity Commission is taking a closer look at the tools.

Firms that have adopted the technology developed by vendor Suited AI and used by Cadwalader include Skadden, Arps, Slate, Meagher & Flom; Sullivan & Cromwell; Willkie Farr & Gallagher; Fried, Frank, Harris, Shriver & Jacobson; Wilson Sonsini Goodrich & Rosati; and Haynes and Boone. Each of those firms, other than Cadwalader, declined to discuss how they are using the tool.

The technology forces employers to consider traits of prospective workers that often get lost in traditional interview processes, said Matt Spencer, chief executive officer of Suited, based in New York.

The traits include attention to detail, logical reasoning, critical thinking, stress response, values, and personality traits, he said.

“When people misunderstand AI, and therefore choose not to use it, they are eliminating the most powerful tool we have to remove long-developed and ingrained human biases from the hiring process,” Spencer said.

Questionnaire Process

Suited’s approach works like this: Job candidates and attorneys working at a firm complete identical questionnaires. Technology then reviews questionnaire results to predict how job applicants match up against existing top performers at the firm.

One job seeker who sat for Suited questionnaires described them as a cross between a personality test and the logical reasoning portion of the Law School Admission Test.

The candidate, who requested anonymity so as not to be penalized by an employer, said different firms asked her to complete the questionnaire at various stages of the hiring process. Some used it as the first step after she submitted an application; others waited until after rounds of interviews.

Skadden requires candidates invited for a call-back interview to sit for the test, Christina Fox, an associate director of attorney talent at the firm, said in a September webinar that Suited hosted.

One of the first hurdles was getting Skadden lawyers to sit for questions to help the algorithm understand what traits high performers share, Fox said.

“There’s quite a bit of skepticism that comes with using AI,” she said. “It took a lot of one-on-one phone calls” and “an internal campaign” to encourage participation.

‘Thriving Industry’

Roughly 40% of U.S. employers use tools such as Suited’s to vet job candidates, and 44% use them to identify possible applicants, according to a 2019 survey from the Society for Human Resource Management.

HireVue, a South Jordan, Utah-based artificial intelligence firm, said it has clients that include over 30% of Fortune 100 companies, such as BP Plc, Delta Air Lines Inc., and Hilton Worldwide Holdings Inc.

The technology is used in some cases when thousands of applications are submitted for a single position, said Lindsey Zuloaga, HireVue’s chief data scientist.

Clients that use the tools say they improve diversity and grow the number of candidates considered for positions, said Niloy Ray, a Littler Mendelson lawyer in Minneapolis.

“These tools are in a growth pattern,” Ray said. “It’s a thriving industry.”

One of the touted benefits—improving workforce diversity—is of particular interest to law firms.

Skadden turned to Suited’s product because of limited on-campus recruiting opportunities and to “continue to expand the candidate pool, especially as we look for students of color and other diverse candidates,” Fox said on the webinar.

Attorneys of color make up about 28% of all law firm associates, according to data compiled by the National Association for Law Placement. But just under 11% of law firm partners were people of color last year, and about 4% were women of color.

Bias Replication?

Federal regulators and lawmakers are putting the tools under a microscope. They question, for instance, whether the tools are calibrated to what a company workforce looks like now rather than what it should look like.

“Machine learning can replicate the same biases of human decision-makers,” said Christine Webber, a partner at Cohen Milstein Sellers & Toll who represents employees in discrimination cases.

Companies including Meta Platforms Inc. have been targeted over their use of such tools for hiring or recruiting. The EEOC has warned of the tools’ potential for perpetuating bias.

Several recent laws are aiming for transparency, requiring companies to disclose the AI tools to employees and candidates. One of the boldest, set to take effect in New York City in January 2023, will require companies to conduct a bias audit of the tools.

“Even when a tool doesn’t consider race, gender, or another protected category, the computer does its own analysis to show what is predictive of being hired,” Webber said. “The exclusion of people from consideration can be a template on past decision-making.”

Meeting Guidelines

Suited’s Spencer and HireVue’s Zuloaga said their companies go to great lengths to eliminate bias.

Before being deployed, every Suited model must meet the guidelines set out by the EEOC’s “four-fifths” rule, Spencer said. That means the selection rate for every demographic group must be at least four-fifths of the rate for the group with the highest selection rate, he said.
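The four-fifths check described above is simple arithmetic. A minimal sketch, using hypothetical selection rates (the function name and example figures are illustrative, not Suited's actual implementation):

```python
def passes_four_fifths(selection_rates):
    """Return True if every group's selection rate is at least
    four-fifths (80%) of the highest group's rate, per the EEOC guideline."""
    highest = max(selection_rates.values())
    return all(rate >= 0.8 * highest for rate in selection_rates.values())

# Hypothetical selection rates: candidates selected / candidates who applied
rates = {"group_a": 0.50, "group_b": 0.45, "group_c": 0.38}
print(passes_four_fifths(rates))  # group_c: 0.38 / 0.50 = 0.76, below 0.8
```

In this example the tool would fail the audit, because group_c's rate is only 76% of the highest group's rate.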

Companies that adopt the technologies research their effectiveness and make sure they don’t reinforce bias, said Littler Mendelson’s Ray.

“The idea that algorithms could be biased and disadvantage protected groups is not controversial,” he said. “That they are biased and, if so, that they are more biased than the process conducted by humans—that’s a harder question.”

In Cadwalader’s view, the technology has been a success. The law student who was flagged by the technology was among the top performers in her group by the end of the summer, Quinn said.

“When she actually got in front of people and worked with them,” he said, “they thought she was extraordinary.”

To contact the reporters on this story: Erin Mulvaney in Washington; Chris Opfer in New York

To contact the editors responsible for this story: John Hughes; Martha Mueller Neff