Employers have a responsibility to inspect artificial intelligence tools for disability bias and should have plans to provide reasonable accommodations, the Equal Employment Opportunity Commission and Justice Department said in guidance documents.
The guidance released Thursday is the first from the federal government on the use of AI hiring tools that focuses on their impact on people with disabilities. The guidance also seeks to inform workers of their right to inquire about a company’s use of AI and to request accommodations, the agencies said.
“Today we are sounding an alarm regarding the dangers of blind reliance on AI and other technologies that are increasingly used by employers,” Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division said.
The DOJ enforces disability discrimination laws with respect to state and local government employers, while the EEOC enforces them with respect to private-sector and federal employers.
The EEOC’s technical assistance document includes guidance for employers using artificial intelligence hiring tools, including questions to ask vendors to ensure the products were built with accessibility in mind.
The agency announced in October that it would look into how to prevent discrimination by studying how employers use AI in hiring, promoting, and firing workers. The last time the commission formally weighed in on hiring tools was in 1978.
Disability advocates have been calling for some sort of action on AI for years. Certain hiring tools, such as personality tests and camera sensors, can embed ableist bias, they say.
“We need to make sure that as we look to the future, we don’t leave anyone out,” EEOC Chair Charlotte Burrows said.
As many as 83% of employers, and as many as 90% of Fortune 500 companies, use some form of automated tool to screen or rank candidates for hiring, according to Burrows.
The EEOC does not currently track data on discrimination related to artificial intelligence.
Long Time Coming
Employers have been waiting for guidance on using AI tools for several years, according to Jenn Betts, a shareholder at Ogletree Deakins. Betts called the document “concrete guidance” that appears to borrow from recommendations by advocacy and industry groups that have been pushing for ethical uses of AI.
“I think many employers are going to look at this as a welcomed development,” she said.
The EEOC guidance directs employers to be critical of AI tools they use and includes a list of questions to ask vendors. Among other things, the EEOC suggested that those hiring “only develop and select tools that measure abilities or qualifications that are truly necessary for the job—even for people who are entitled to an on-the-job reasonable accommodation.”
Natasha Duarte, a project director at Upturn, an advocacy group that has studied the use of AI tools in hiring, said that point is critical because such tools can screen out applicants with disabilities based on traits that have no bearing on whether they can do the job.
“That also helps avoid other types of discrimination as well, like race and gender discrimination, because it relies less on generalizations about the type of person that’s been successful in this job in the past, instead of directly measuring if the applicant can do a job,” Duarte said.
Suit Filed
Although there “is a lot to like” in the guidance, the EEOC is limited in its ability to study the specific ways these tools discriminate against workers, because it can generally access that data only when it is pursuing a charge of discrimination, Duarte said.
For the first time, the agency this month sued an English-language tutoring services company, iTutorGroup, for allegedly programming its online recruitment software to automatically reject older applicants because of their age.
The agency could also start a technical study, but it still wouldn’t have the authority to collect data from employers.
“That’s one thing that needs to happen: Congress needs to give the EEOC more authority to be able to do that,” Duarte said.