- AI programs can detect workplace hazards, notify employers
- Attorneys urge employers to consider labor, privacy issues
As artificial intelligence gains a foothold in workplace safety and health compliance, attorneys are advising employers to consider evolving legal concerns about the data the monitoring systems collect and how it will be used.
AI has been transforming the employment landscape, offering businesses a new tool for recruiting, hiring, monitoring, and disciplining workers. For worker safety, the technology can identify hazards like near misses between humans and machines or spot a worker improperly lifting a box, potentially reducing the risk of harm to employees and liability for OSHA violations.
“The technology is absolutely incredible,” said attorney Bradford Kelley, a shareholder at Littler Mendelson P.C. in Washington who advises clients on labor and AI-related issues. “It provides a lot of promise for increasing worker safety and health.”
But before employers begin using AI programs in their safety efforts, they must take into account things like labor and privacy issues that could arise from their use, attorneys said.
“There is a lot of pre-work you should do before you turn on tracking tools,” said Jennifer Betts, a labor attorney in Pittsburgh and co-chair of Ogletree Deakins’ Technology Practice Group. “Even a well-intentioned employer could potentially get themselves in trouble if they don’t do a careful evaluation of all the possible legal risks of these types of tools.”
‘Omnipresent’ Safety Manager
Intenseye and Voxel Labs Inc. are two companies developing and offering worker safety AI software and monitoring programs.
The systems use workplace video cameras to feed live images that are processed by AI software, usually at a cloud data processing center.
When the AI recognizes a workplace safety violation, such as a forklift entering a pedestrian-only walkway, the software notes what happened and creates a report. The employer retains ownership of the images.
“Imagine an omnipresent, ever diligent safety manager who has access to all of the information inside a facility,” Voxel CEO Vernon O’Donnell said.
In severe cases, a report can go immediately to a facility supervisor. More often, incident reports are collected and analyzed by the system for later review by supervisors or safety managers.
A goal is to spot problem areas and act on that information before an incident occurs, rather than taking corrective action only after an accident.
Eye on AI
The Biden administration has made regulating AI a top priority, calling on federal agencies to set guardrails for the technology. The US Labor Department was tasked with developing principles and best practices “to mitigate the harms and maximize the benefits of AI for workers” by addressing a wide range of topics, including “workplace equity, health, and safety.”
So far, though, the US Occupational Safety and Health Administration hasn’t addressed AI’s use in safety programs and didn’t respond to requests for comment about agency plans.
In the absence of official guidance, attorneys advised employers to consult their human resources and legal staffs to identify potential pitfalls before implementing AI to spot safety hazards.
While employees generally have limited privacy rights at work, some labor laws place restrictions on surveillance, particularly for monitoring union activity.
Intenseye and Voxel executives said they are aware of workers’ privacy concerns. They each said their systems aren’t designed to follow individual workers. Facial scans and the collection of other biometrics that would identify a specific worker aren’t obtained or used, they added.
“We never process a worker’s face or identify who is working,” said Intenseye CEO Sercan Esen.
O’Donnell said the company’s goal is to “influence cultural change and behavioral change.”
“It’s not about, ‘Hey, Joe forgot his hard hat today,’” he said. “It’s about: you have 76 hard hat-missing incidents.”
Employers might also be required to notify and bargain with their workers’ union before implementing new technology.
“If they were to try to cram it down on a unionized workforce, that could create some legal challenges,” said Thomas Lenz, a partner and labor law specialist with Atkinson, Andelson, Loya, Ruud & Romo in Pasadena, Calif.
“There’s incredible value in being transparent and being open about the uses of these technologies and saying this is why we’re doing it,” he said.