- Agency aims to boost data transparency, worker protections
- Biden administration focused on AI’s impact on workers
Employers must get permission from workers when using external consumer reports about them—including “black box” AI or algorithmic scores—and allow employees to dispute information used in background checks, the Consumer Financial Protection Bureau said.
Companies have increased their surveillance of workers, including using credit reports and third-party background checks, when deciding whether to hire, demote, or fire them, the CFPB said. Companies also provide data about worker performance to third parties such as consumer credit reporting companies, the agency noted in a Thursday circular.
The new guidance is part of the Biden administration’s broader efforts to scrutinize the use of algorithms and third-party data for employment, credit, and other decisions.
The use of outside background checks and other credit reporting tools to monitor workers, and the sale of data collected about workers, are covered by the 1970 Fair Credit Reporting Act, the CFPB said.
“Workers are protected not just by federal labor law, but a whole set of other federal laws, too,” CFPB Director Rohit Chopra said at a joint field hearing with the Department of Labor in Okemos, Mich.
Under the federal credit reporting law, companies are required to get consent from workers when collecting data about their work using biometric and other electronic tracking devices, and to notify workers when purchasing such reports, the CFPB said.
Companies are also required to tell workers how the data in those reports are used to make employment decisions, the agency said.
Workers are allowed to dispute information in background reports, and companies aren’t allowed to sell data collected about workers under the law, the CFPB said.
AI Risks
The CFPB warned that companies are using consumer reports to predict whether workers might be inclined to engage in union organizing or the likelihood an employee will leave a job. The reports can also be used to reassign workers, boost output requirements, and take disciplinary actions, the CFPB said.
Outside of the workplace, companies can use credit reports and outside background checks to monitor their workers’ social media and other activities, helping to create a sort of social score, Chopra said.
Data collection by employers is a heightened risk given the rise of artificial intelligence, Acting Labor Secretary Julie Su said at the field hearing.
In many instances, data collection using ID cards and wearable devices can even be used to control worker movements, Su said.
Shannon Kozlowski, an
An Amazon spokesperson said the company doesn’t use the camera technology Kozlowski referenced to monitor employees. Instead, it’s used to protect the safety and security of workers and to secure inventory, the spokesperson said.
“We comply with applicable laws and regulations in the jurisdictions in which we operate—including those related to employment decisions, employee break periods, technology use, and any others that pertain to our workplace,” the Amazon spokesperson said.
The Labor Department issued guidance this month warning employers against using AI to interfere with labor organizing or to reduce wages, break times, or benefits.
“I believe there is a way to embrace innovation, but not sacrifice worker well-being,” Su said Thursday.