Daily Labor Report®

Law on Hiring Robots Could Trigger Litigation for Employers

Oct. 11, 2019, 8:45 AM

Illinois employers using “hiring robots”—software systems fueled by artificial intelligence to recruit and screen job applicants—will have to adapt to a new set of legal requirements under a law taking effect in 2020.

The law also could serve up a fresh avenue for employment litigation in a climate already crowded with civil rights claims, wage and hour claims, and, at least in Illinois, class actions alleging abuses of employees’ biometric privacy.

Beginning Jan. 1, 2020, employers will have obligations under the Artificial Intelligence Video Interview Act, a first-in-the-nation statute that imposes transparency, consent, and data destruction duties on employers using AI to screen applicants for Illinois-based positions. The AI video act, however, is silent on enforcement in the face of violations—a key problem for employers seeking to comply and aggrieved parties seeking remedies.

Attorneys advising businesses on their employment and privacy duties are telling clients to ignore any statutory gaps and treat the AI video act as a hard obligation that eventually will be tested in court.

“The statute is very unclear regarding enforcement and remedies,” said Howard Robbins, a partner in the labor and employment department of Proskauer in New York. “But it’s going to be tested somehow, whether it’s by private litigants or the attorney general. Somebody is going to be the test case and you don’t want to be that employer.”

David Oppenheim, a partner with Bock, Hatch, Lewis & Oppenheim LLC in Chicago, offered the same advice from a plaintiff perspective.

“There has to be some sort of enforcement mechanism and the courts tend to be the default,” Oppenheim said. “I suspect ultimately, violations will be actionable in some fashion—it’s just a question of under what theory.”

Rise of Hiring Robots

Employers increasingly are using AI-powered platforms such as Gecko, Mya, AutoView, and HireVue to streamline their recruitment processes, control costs, and recruit the best workers. Providers claim their technologies analyze facial expressions, gestures, and word choice to evaluate qualities such as honesty, reliability, and professionalism.

But the technology is also controversial. Privacy advocates contend AI interview systems may inject algorithmic bias into recruitment processes, and that AI systems could generate unfounded conclusions about applicants based on race, ethnicity, gender, disability, and other factors.

Illinois lawmakers sought to calm such worries earlier this year.

The AI video law, signed by Gov. J.B. Pritzker (D) Aug. 8, requires employers to disclose their use of AI analysis tools as part of the recruitment process. Before any video job interview, employers must notify the applicant that AI may be used to analyze the applicant’s fitness for duty; provide the applicant with an information sheet explaining how the AI works; and obtain consent for the use of AI technology during the interview. The law provides no special enforcement powers to the state and no private right of action to aggrieved parties.

Robbins pointed to several possible legal problems for employers, but highlighted the potential for algorithmic bias leading to lawsuits under state or federal anti-discrimination statutes. Given the newness of AI technology, Robbins said he fears subtle algorithmic features could screen out protected classes. He suggested employers think about their goals when implementing AI programs and understand exactly how their systems function. Ideally, employers also should test their systems against a pool of candidates for potential bias.

“I would advise employers not to defer entirely to assessments made by AI systems because of the risk of disparate impact discrimination litigation,” he said.

Protecting Biometric Data

A second concern for Robbins and other defense attorneys is whether the AI video interview law will sweep employers with no previous concerns about biometric privacy into causes of action under Illinois’ Biometric Information Privacy Act (BIPA), another first-in-the-nation privacy law that has generated a river of litigation.

BIPA, enacted in 2008, established privacy standards for Illinois residents’ biometric data, including fingerprints, retinal scans, voice prints, and scans of facial geometry. Creative plaintiff attorneys could use the two laws in tandem to allege an employer using hiring robots misused a job applicant’s biometric data.

“Companies that may not otherwise be collecting biometric information for the purposes of analytics or other reasons, may be drawn into the realm of BIPA and should be aware of the consequences of collecting facial geometry and voice prints, and recognize the duties around collecting biometric information,” said K.C. Halm, a privacy and data protection partner with Davis Wright Tremaine LLP in Washington.

Halm said AI interview systems could pull employers into the crosshairs of BIPA because facial geometry and voice prints are considered biometric identifiers under the law. Employers failing to comply with the notice and consent features of BIPA during their use of hiring robots could wind up in court, he said.

While other states, including Texas and Washington, have established biometric privacy laws, only BIPA has a provision granting aggrieved parties a private right of action. Penalties for violating the law are $1,000 per negligent violation and $5,000 per willful or reckless violation.

The law has generated significant litigation in Illinois courtrooms. Docket reports in Cook County Circuit Court show about 300 actions under BIPA. Among the defendants are major airlines, including American, United, and Southwest; hotel chains including Hilton, Hyatt, and Intercontinental; restaurant companies including Hooters, Wendy’s Co., and Burger King; and entertainment companies including Caesar’s, Universal Parks & Resorts, Six Flags Entertainment Corp., and Life Time Fitness.

Many of the suits involve consumers asserting a retailer or service company misused their biometric markers, but more than half of the actions target employers for failing to gain proper consent for collecting and using fingerprints or handprints for timekeeping purposes.

Implied Right of Action?

Erin Bolan Hines, a Chicago attorney defending employers in class actions, said some businesses could be caught off guard by the AI video law’s silence on enforcement. To the extent an employer might be collecting and mishandling biometric data through an AI hiring system, though, she predicted plaintiffs would develop theories under BIPA to test the waters.

“There is a hurdle because there is no private right of action in the artificial intelligence statute, but there are some creative plaintiffs lawyers out there,” said Hines, who’s with Shook, Hardy & Bacon LLP.

Oppenheim, who won a major BIPA ruling in front of the Illinois Supreme Court earlier this year, said plaintiffs might find success with an “implied right of action” theory in the face of a notice, consent, or data destruction violation of the AI video law. In this context, a plaintiff could argue Illinois intended for an aggrieved party to have a path to the courthouse.

“There might be some argument that this is similar enough in scope to BIPA or some other statute such that you borrow the damages scheme,” he said. “I’m sure that’s going to be tested. Otherwise it’s almost as if this law is a nonbinding resolution. And unlike Congress, the Illinois Legislature doesn’t do that.”

Robbins, Hines, and Halm said employers using AI hiring systems should implement comprehensive policies and procedures pertaining to both the AI law and BIPA. Such policies and procedures should address: notification of job applicants; transparency; consent for the use of AI technologies and biometric information; limits on distribution of interview material; and destruction of information collected during the interview process.

“This AI technology is the new wave so employers need to be prepared,” Hines said. “We already saw what happened with BIPA.”

To contact the reporter on this story: Michael J. Bologna in Chicago at mbologna@bloomberglaw.com

To contact the editors responsible for this story: Martha Mueller Neff at mmuellerneff@bloomberglaw.com; Karl Hardy at khardy@bloomberglaw.com