Job seekers may assume a video interview with human resources is person to person.

But there’s a growing chance that an artificial intelligence system is there, unseen, studying a candidate’s facial expressions, speech patterns and gestures for clues to their personality. The systems can score candidates with algorithms customized to what the company envisions as an ideal hire.

The technology’s role has Illinois poised to become the first state, according to the National Conference of State Legislatures, to require companies to tell applicants that they’re being screened that way. Gov. J.B. Pritzker (D) is expected to sign the Artificial Intelligence Video Interview Act (H.B. 2557) into law. The bill would require companies to disclose that AI is being used in the video interview and give applicants the option to have the video data deleted.

“I found that most people think that three-minute video they send in is just to an interviewer, but what they didn’t know is they were being analyzed by a pre-screener system,” said Illinois Rep. Jaime M. Andrade, Jr. (D), who introduced the legislation.

Illinois has been a leader in regulating the use of such emerging technology, including a law governing the use of biometric data like fingerprints and retinal scans.

A 2017 PricewaterhouseCoopers study found that 40 percent of international companies are using AI applications for HR functions. Companies can use AI to scan resumes for keywords and measure behavioral traits, in addition to analyzing video interviews. Developers say AI can increase hiring productivity and help remove human bias from the interview process.

“Most people are better off to be reviewed by a properly designed algorithm than a person,” said Loren Larsen, chief technology officer at Utah-based HireVue, whose AI interview system has been used by clients including Unilever and JPMorgan Chase. “People are super biased” and they risk paying attention to “things that are not really relevant to the job such as how the name is spelled, who we hired last and the school they went to,” Larsen said.

But some worker advocates say such technology has exhibited human-created biases. Some developers rebuff such allegations, adding that they’re constantly tweaking the systems.

Another concern is whether applicants will disqualify themselves if they opt out of an AI-involved interview, and what legal risk companies may face if an algorithm is biased against certain applicants.

Unmasking AI Systems

Concerns aside, Illinois’ measure, which recently passed its Democratic-led legislature unanimously, may drum up similar legislation in other states, the American Civil Liberties Union and business groups told Bloomberg Law.

“This kind of use of AI is being heralded forward so quickly and there are so many questions about the technology and its fairness and whether it’s discriminatory in any way,” Jay Stanley, an ACLU senior policy analyst, said. “The first thing that is needed is transparency.”

The Illinois bill would require companies to explain to applicants how the AI works and to disclose that it may analyze their facial expressions and fitness for the position. Companies also would have to get applicants’ consent to use AI technology during the interview process.

The legislation also would require companies to delete all copies of the video interview 30 days after an applicant’s request. A spokeswoman for Pritzker didn’t immediately respond to requests for comment.

Companies such as JPMorgan Chase say the use of video interviews offers valuable insight into how candidates think.

“In recent years, we began integrating pre-recorded video interviews into our application process so students had an opportunity to put their best foot forward, telling the story beyond their resume,” Matt Mitro, JPMorgan Chase’s global head of campus recruiting, said in a March statement. “And to be sure students really benefit, we offer them practice questions in advance to get comfortable with the technology and the chance to re-take if something goes awry.”

Unilever, meanwhile, uses HireVue’s AI systems to help select candidates based “on data, avoiding biases, providing a more reliable output,” said Leena Nair, chief HR officer of the London-based consumer goods company.

HireVue told Bloomberg Law that prompts in the process alert applicants that AI is in use. The firm touts years of experience with hundreds of clients and millions of interviews conducted with the technology, which can study thousands of data points such as an interviewee’s use of keywords, tone and body language.

North Carolina-based Leoforce, which develops AI systems such as a chatbot that screens job candidates through messaging windows, “fully supports the direct and clear disclosure of the use of AI in the entire hiring process,” said Juan Benito, director of product management at Leoforce.

“If people understand AI, they won’t fear it as much, and disclosure is the first step in that understanding,” Benito said. He declined to discuss the specifics of the Illinois legislation.

AI Faces Bias Claims

But concerns that biases are embedded in AI stem from examples such as one at Amazon.com Inc., which stopped using an AI screening tool in 2017 after discovering that the technology rated female applicants lower than men.

The ACLU and others are questioning whether potential biases persist in today’s AI systems, which are typically trained to find the best outcome for a task based on information such as the profiles of existing high performers.

“There is potential for unfair and discriminatory outcomes here as companies blunder into these technologies,” Stanley said. “The entire notion that you can gain information from viewing a facial expression is completely without foundation.”

Stanley and others say circumstances such as a person’s culture can shape their expressions and mannerisms, characteristics that could weigh against them with algorithms in the video interview.

“We are prone to the idea that using technological systems might be more fair or objective than we are on our own,” said Sarah Myers West, a postdoctoral researcher at New York University’s AI Now Institute. She added that “it doesn’t matter the intentions of the developers, if the reality is that the impact of these systems is discriminatory or causes harm.”

Looking a Certain Way

HireVue says it employs psychologists and data scientists on its teams to continually test its systems for bias.

“We’re looking at what is common for what are top performers for that job,” Larsen said. “We’re looking at tens of thousands of data points. One thing doesn’t knock you down because there are so many points of data to look at.”

Larsen added that employers are already regulated by the Equal Employment Opportunity Commission, which sets guidelines to avoid workplace discrimination.

Such guidelines will make it necessary for companies to “be best informed about factors that are being taken into consideration” in the AI to avoid violating such laws, said Franklin Wolf, a Chicago-based employment lawyer at Fisher Phillips.

“When you take away the human element and place in the AI, the employer doesn’t know what factors the AI is considering,” Wolf said. “One of the first concerns is whether an employer is having an open and honest communication with the AI provider.”

Other potential issues include ensuring that employers can continually review AI processes, as well as accommodating disabled applicants and those who may not have the technology to do a video interview. Companies, for example, could provide kiosks to properly conduct the interviews, said Adam S. Forman, a Chicago-based employment attorney with Epstein Becker Green.

Forman cautioned that the Illinois legislation will spur more efforts to regulate AI.

“I think regulation and statutes like this are the beginning, not the end,” he said.