California’s ADMT Regulations Reshape the AI Business Landscape

Nov. 5, 2025, 9:30 AM UTC

New rules under the California Consumer Privacy Act (CCPA) will significantly reshape how businesses approach data privacy and artificial intelligence.

These rules focus on three major areas: cybersecurity audits, risk assessments, and how companies use automated decision-making technology (ADMT). While the rules go into effect Jan. 1, 2026, businesses that use ADMT to make significant decisions must comply with the ADMT requirements beginning Jan. 1, 2027.

Organizations doing business in California will need to consider how they secure personal data and how they use ADMT. Meeting any of the applicability thresholds of these new rules could mean substantial changes in how they manage risk, protect consumer data, and use ADMT overall.

New Rules

In November 2024, the California Privacy Protection Agency (CPPA) began a rulemaking process to update existing CCPA regulations and to draft new regulations covering cybersecurity audits, risk assessments, and ADMT. The process drew significant feedback from tech companies, advocacy groups, and government leaders. ADMT emerged as a key point of debate, with concerns that the proposed rules could hinder AI innovation.

Several comments received during the rulemaking process expressed concern that the definition of ADMT was overly broad and could be interpreted to apply to nearly all common business software, such as spreadsheets, calculators, databases, routine automation tools, and technologies that merely assist human decision-making. California Gov. Gavin Newsom urged the CPPA to proceed with caution to avoid stifling the AI industry.

The final regulations reflect those concerns, scaling back restrictive elements and giving businesses more flexibility. For example, the definition of ADMT is narrowed to encompass only technology that replaces or substantially replaces human decision-making, and businesses are no longer required to conduct risk assessments or comply with ADMT obligations merely for profiling a consumer for behavioral advertising.

Updates and Requirements

Key requirements of the finalized regulations regarding ADMT include the following:

Definitions: ADMT is defined as “any technology that processes personal information and uses computation to replace human decisionmaking or substantially replace human decisionmaking.” To “substantially replace human decisionmaking” means to make a decision based solely on the ADMT’s output, without human involvement. Human involvement requires a reviewer who can interpret and use the output, consider and analyze it along with other relevant information, and who has the authority to make or change the decision.

Scope: The regulations apply when ADMT is used to make a “significant decision” about a consumer, meaning a decision that results in the provision or denial of financial or lending services, housing, education enrollment or opportunities, employment or independent contracting opportunities or compensation, or healthcare services. Expressly excluded from the definition are tools such as firewalls, anti-malware, calculators, databases, and spreadsheets, as long as they don’t replace human decision-making. Depending on how they are deployed, agentic and other AI technologies used by businesses may fall within this definition.

Notice Requirement: When a business plans to use ADMT to make a significant decision, it must notify the consumer before or at the time it collects personal information and explain, in clear and simple terms, why it’s using the technology.

Consumer Rights: Consumers have the right to opt out of, and access information about, ADMT used to make significant decisions affecting them. Businesses aren’t required to offer an opt-out if they provide the consumer with a method to appeal the decision to a human reviewer who has the authority to overturn it, or where the ADMT is used for admissions, hiring, or work assignments, as long as such use doesn’t result in unlawful discrimination.

Risk Assessments for ADMT: Businesses must perform risk assessments when using ADMT for significant decisions or specific training purposes, and document the types of personal information processed, along with the system’s logic.

The CPPA has indicated that the regulations will likely evolve as tech and business practices change.

For businesses that fall under the CCPA, now is the time to begin planning for phased compliance with the final ADMT regulations. Businesses should inventory the technologies they use to determine whether any such technologies would be considered an ADMT under the regulations and whether they are being used to make a significant decision.

Businesses should also establish processes for consumers to submit requests to access and opt out of the use of ADMT, and for complying with those requests. If a business is also navigating other AI laws, such as the European Union’s AI Act or the Colorado AI Act, the ultimate goal should be a compliance strategy that works across all of these legal regimes.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law, Bloomberg Tax, and Bloomberg Government, or its owners.

Author Information

Sharon Klein is a partner at Blank Rome and co-chair of its privacy, security, and data protection practice.

Alex Nisenbaum is a partner in Blank Rome’s privacy, security, and data protection practice.

Karen Shin is an associate in Blank Rome’s privacy, security, and data protection practice.

To contact the editors responsible for this story: Jada Chin at jchin@bloombergindustry.com; Jessica Estepa at jestepa@bloombergindustry.com
