Daily Labor Report®

As AI Enters Hiring, Some Ask if It’s Injecting New Biases (1)

April 17, 2019, 9:43 AM; Updated: April 19, 2019, 6:50 PM

With each click of the mouse, the orange balloon on the screen expands, adding five more cents to a pile of cash. The sum, recorded in the left corner of the computer screen, reads $3.05. Click. Click. Fifteen cents goes up to 20 cents. Twenty goes up to 25.

At any click, the balloon may pop. The green “Collect” button presents a tempting choice: cash in, or risk it all and go for more?

The money isn’t real, but the tension is: the game will help determine whether college students land highly coveted internship spots at the investment firm JPMorgan Chase & Co.

The balloon game is part of a series of 1- to 3-minute activities, using neuroscience and artificial intelligence, that are designed to measure behavioral traits that may predict future job performance. The technology was developed by the New York City-based tech startup Pymetrics, which says it’s serving a need at JPMorgan and other major companies to diversify their applicant pools and avoid the biases of traditional hiring practices.
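The mechanics of such a risk task can be sketched in a few lines of code. The following is a minimal, hypothetical simulation: each click adds a fixed amount to a temporary pot, each click risks a pop that forfeits the pot, and "Collect" banks it. The pop odds and payout values here are illustrative assumptions, not Pymetrics' actual parameters.

```python
import random

def play_balloon(pumps, pop_prob=0.1, cents_per_pump=5, rng=None):
    """Simulate one balloon: pump `pumps` times, banking the pot if it never pops.

    Returns cents earned (0 if the balloon popped first).
    All parameter values are illustrative, not Pymetrics' actual settings.
    """
    rng = rng or random.Random()
    pot = 0
    for _ in range(pumps):
        if rng.random() < pop_prob:   # balloon pops; the pot is lost
            return 0
        pot += cents_per_pump         # each click adds five more cents
    return pot                        # player hits "Collect" and banks the pot

# A risk-seeking strategy (many pumps) has a higher ceiling but pops more often.
# How long a player pumps before collecting is the kind of behavioral signal
# such games are designed to record.
```

The trade-off the game exposes is the point: a cautious player banks small sums reliably, a risk-tolerant one chases larger pots and sometimes loses everything.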

The question is whether relying on AI-based interactive games or social media in the recruitment process trades some forms of bias for others. Employers may face lawsuits for not providing alternative measurement tools for older and disabled candidates, and government agencies are taking a close look at how the technology is used.

“Technology does have the potential to eliminate some forms of discrimination,” Kevin Kish, director of California’s Department of Fair Employment and Housing, said. “If you are showing up to an in-person interview and your interviewer has an unconscious or implicit bias against you, this type of tech can remove that bias. But all of this technology also has the ability to replicate human bias as well.”

Moreover, AI-based tools like Pymetrics’ are likely to become more advanced and integrated in the hiring process this year, according to U.S. software company iCIMS.

Companies See Results

Pymetrics’ tools compare potential candidates’ results with those of successful company employees who’ve already taken the test. The goal is to rely less on human judgment, which has been shown to lend itself to discriminatory practices. Other major companies like Unilever Plc and Accenture Plc are using Pymetrics in a similar way, those companies told Bloomberg Law.
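The general idea — scoring a candidate's measured traits against a profile built from successful incumbents — can be sketched as follows. The trait names, values, and the use of cosine similarity are assumptions for illustration; the article does not describe Pymetrics' actual model.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length trait vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Hypothetical traits: (risk tolerance, attention, planning), each scaled 0-1.
incumbents = [(0.7, 0.9, 0.8), (0.6, 0.8, 0.9), (0.8, 0.85, 0.75)]

# Build the "successful employee" profile as the trait-wise mean.
profile = tuple(sum(t) / len(incumbents) for t in zip(*incumbents))

candidate = (0.65, 0.9, 0.85)
score = cosine(candidate, profile)   # closer to 1.0 = more similar to incumbents
```

Note that this is also where bias can re-enter: if the incumbent pool is itself skewed, a similarity score faithfully reproduces that skew — the failure mode described in the Amazon example below.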

“We recognize that using technology in the selection of employees is a complex field, so we’re starting with a pilot,” Matt Mitro, JPMorgan’s head of campus recruiting, said in a firm blog post. “In our earliest testing with Pymetrics, we’ve seen extremely high completion rates and satisfaction scores among candidates—indicating that students are seeing the added value in this scientific approach.”

Unilever is still monitoring the tool’s success, but so far feels that the feedback has been positive, a Unilever spokesperson said.

“It allowed us to broaden the base of applicants from a handful of college campuses to thousands of universities, and we doubled the number of applications we received, while also condensing the hiring process from several months to about four weeks,” the spokesperson said.

Still, Amazon.com Inc. scrapped a machine-learning recruiting engine in 2017 after discovering that it rated female applicants lower than male ones. The program vetted applicants using patterns in resumes submitted to the tech giant over a 10-year period, most of which came from men. As a result, it rated men’s applications higher.

Constantly Checking for Bias

“AI is like electricity or fire. It can keep you alive or it can kill you,” Frida Polli, Pymetrics’ founder, said. She founded the company five years ago to address workplace bias.

The company constantly checks its algorithms for potential bias, said Polli and Kelly Trindel, Pymetrics’ head of science and diversity analytics.

“I see the potential for concern and fear and all of that, but there is so much more potential for positive,” Trindel said.

The two said government regulations can be another way to prevent this kind of technology from being used in the wrong way.

Government Scrutiny

Workers over age 40 aren’t necessarily inept at technology, Laurie McCann, senior counsel with AARP, cautioned. Still, the changing technology is a concern at the association focused on seniors.

On gaming assessments, McCann said: “On its face it seems to suggest it’s more geared to younger applicants. So then the question becomes: is this a reasonable way to test suitability for a job?”

New technology in hiring may exclude certain workers, especially those of a certain age or those with a disability, Cathy Ventrell-Monsees, a senior adviser with the Equal Employment Opportunity Commission, said.

The civil rights watchdog, which received more than 40,000 complaints of disability or age discrimination in fiscal year 2018, is examining how recruiters are leveling the playing field in those situations, she said. For example, if a company recruits on a college campus, is it also reaching out to older workers so that access to job opportunities isn’t closed off?

California’s Department of Fair Employment and Housing is taking steps toward regulating this area as well, Kish said.

The Fair Employment & Housing Council proposed new regulations last month that would, in part, prohibit the use of online job application sites that require entry of age to access an application, or that use algorithms that have the effect of screening out older applicants.

Accessibility an Issue

The growing use of technology in hiring and recruitment has been a particular concern for the National Federation of the Blind, said Chris Danielsen, a representative for the organization.

“The issue is not so much that blind people don’t have digital skills, but rather whether the technologies being used are accessible to the screen reader technology that we use in order to access digital content,” he said. “We have definitely seen inaccessibility as an issue on job recruiting sites and online job applications and I think that, depending on the exact type of games being used, the kind of initiative that JPMorgan Chase is using is also concerning.”

Danielsen noted that the organization wasn’t familiar with the Pymetrics program.

“While there are definitely blind gamers, the accessibility of gaming is definitely in its very early stages, particularly where games are highly visual or graphical,” he said.

Pymetrics offers alternative versions of its assessments for people with disabilities but is still working on one for blind candidates, Trindel said. In the meantime, any candidate can request a disability accommodation when an alternate version isn’t available.

Risk for Litigation?

AI-based hiring practices are “a cutting edge issue” and will become a subject of litigation as the technology develops, James Hardin, an employment attorney in California, said.

Hard Rock Cafe, GameStop, Dart Container, and Albertsons were all sued last year by blind applicants who couldn’t fully access their online job applications.

To avoid such lawsuits, an employer should make sure it offers accommodations for applicants, Hardin said.

It’s also the employer’s responsibility to ask tech providers how they check for algorithmic bias, Trindel said.

“I hope all employers who are speaking to other vendors that are similar to us are asking hard questions and are getting good answers,” she said.

To contact the reporter on this story: Jaclyn Diaz in Washington at jdiaz@bloomberglaw.com

To contact the editors responsible for this story: Bernie Kohn at bkohn@bloomberglaw.com; Simon Nadel at snadel@bloomberglaw.com