States Target AI That Tells Companies How Much to Pay Workers

Jan. 21, 2026, 10:10 AM UTC

The nascent practice of deciding workers’ wages via algorithms has caught the attention of advocates and state lawmakers who are pushing for regulation aimed at preventing discrimination and pay irregularities.

Bills in California, Colorado, Georgia, and Illinois proposed parameters last year for making compensation decisions with artificial intelligence systems, and at least a few more states are trying again in 2026. Most of the bills aim to bar use of personal data that’s unrelated to work.

The push by policymakers and advocates looks largely proactive, proposing guardrails before businesses begin widespread use of AI to determine pay. As the technology spreads, businesses face new twists on old legal risks that vary somewhat between traditional employment relationships and independent contractor models.

The best-known example of algorithmic pay setting is among ride-hail and food delivery drivers. Uber Technologies Inc., Lyft Inc., and their app-based peers use algorithms to determine how much they’ll pay drivers and charge customers for each trip, informed by data collected on each person’s preferences and locations.

“Uber was the baseline model, and we are seeing that model being exported into different industries,” said Travis Hall, state director for the Center for Democracy & Technology.

Tech vendors offer automation software for setting workers’ pay targeted to sectors such as customer service, health care, delivery logistics, and manufacturing, according to a Washington Center for Equitable Growth report. But the report noted there’s limited evidence of how widely employers are using those AI tools.

Workers classified as independent contractors are most susceptible to variable, algorithm-set pay rates, including nurses who increasingly pick up flexible shift work through mobile apps, advocates say. Those work arrangements inspired the legislative proposals, yet it isn’t clear they would be covered since the bills reference employees and not independent contractors.

But traditional employers using AI tools to decide compensation levels face their own legal considerations, such as complying with wage and hour, antidiscrimination, and antitrust laws at the state and federal levels.

“It’s on the horizon,” said Jennifer B. Rubin, an employment attorney with Mintz, Levin, Cohn, Ferris, Glovsky and Popeo PC. Even employers that aren’t using those AI tools internally should ask their outside compensation consultants whether and how they do.

Legal Risks

Anti-bias and equal pay laws pose a risk for businesses automating pay decisions for employees.

“The employer doesn’t know what criteria the AI is using,” said Joseph G. Schmitt, an employment attorney with Nilan Johnson Lewis PA.

An AI tool could use historical pay data to decide to offer women lower salaries or hourly pay than men, which would violate state and federal anti-bias laws, he said. Businesses can mitigate their legal risks by ensuring humans make the final decisions, assessing their AI tools, and conducting pay equity audits.

“There’s definite risk to taking your responsibility as an employer and giving all of that over to a robot,” Rubin said.

Employers are safer from bias claims when using AI to decide pay rates for a class of employees, such as a retail chain’s store managers, “not based upon a particular employee’s characteristics,” Schmitt said.

But that strategy could give rise to antitrust claims if an industry’s major competitors all use the same AI tool and arrive at similar wage rates.

Recent price-fixing claims against owners of apartment buildings—amid growing scrutiny in many industries—could be an indicator of what’s to come in labor markets, said Robin S. Crauthers, an antitrust attorney with McCarter & English LLP and former Justice Department trial lawyer.

The risk is softened by the need for intentional collusion among competitors for a price-fixing claim to stick, she said, but defending a collusion claim is expensive regardless.

“The use of AI to set employees’ wages would not give a company a free pass,” Crauthers said. “The core question is: did the competing buyers of the labor stop making independent pay decisions?”

California drivers sued Lyft and Uber in 2022 accusing them of illegal wage-fixing, but dropped their case in 2024 after a judge ordered their claims to arbitration.

State Proposals

State-level efforts remain active in 2026, with bills up for consideration so far in Georgia, Illinois, Maryland, and New York. Some combine advocates’ concerns about wage-setting with surveillance-based pricing for consumers.

California lawmakers’ attempt last year at a broader regulation of AI-powered workplace management, the “No Robo Bosses Act,” was vetoed by Gov. Gavin Newsom (D).

Any state proposals to restrict employers’ use of AI face the threat of federal preemption, including under President Donald Trump’s executive order from December.

A key motivation behind the bills is that businesses shouldn’t be allowed to use personal data unrelated to work to set the smallest possible salary, or “desperation wage,” said Laura Padin, director of work structures at the National Employment Law Project.

The bills define surveillance to cover any gathering or purchase of personal data including biometric characteristics, parenthood status, behavior patterns, weight, and home address.

“You shouldn’t be charged different amounts based on who you are or paid different amounts based on who you are,” Padin said.

“At a systemic level, the house always wins,” Hall said. “They are collecting the information in order to pay the minimum they have to for a particular shift.”

States could also regulate algorithmic wage-setting through broader measures such as an Illinois law requiring notice whenever AI is used for employment decisions.

The practice of setting pay rates via AI also heightens the need for regulators to enforce worker classification laws, said Jeremy Abay, an employment attorney at Pond Lehocky Giordano Inc.

“The courts and the regulators are much more forgiving if your worker is an independent contractor,” he said. “It’s going to push more companies to misclassify their workers.”

To contact the reporter on this story: Chris Marr in Atlanta at cmarr@bloombergindustry.com

To contact the editors responsible for this story: Rebekah Mintzer at rmintzer@bloombergindustry.com; Genevieve Douglas at gdouglas@bloomberglaw.com
