Baker Donelson’s Vivien Peaden spotlights key actors in AI value chains that are impacted by the EU’s new AI regulation, with implications for US businesses that operate in the region.
The long arm of the EU’s new Artificial Intelligence Act reaches companies across all sectors that develop or distribute AI in the EU, as well as those using AI systems whose outputs affect EU residents.
Several players in the EU’s AI value chains are impacted by the rollout of the world’s first AI regulation, approved by EU lawmakers March 13: providers, deployers, product manufacturers, importers, distributors, and authorized representatives.
Expanded Definition
The AI Act expands the definition of “AI system” to include any autonomous machine-based system that “infers, from the input it receives, how to generate outputs.” With the advent of generative AI, the act also applies to “general purpose AI models,” or GPAI models, that are key building blocks of AI systems with broad applications, such as DALL-E and OpenAI’s GPT-4.
The regulation categorizes AI systems based on the potential harms they may cause, including “high-risk AI systems” used in critical infrastructure, employment, the environment, credit scoring, elections, border control, and health, among other areas. Key actors in the AI value chain face heightened compliance requirements for developing, using, or distributing high-risk AI systems.
Provider
A “provider” stands at the center of the AI value chain, as it develops AI systems or GPAI models under its own name or trademark. The act applies to AI providers in the following scenarios, regardless of their place of establishment and whether the AI is offered free of charge or commercially:
- “Placing AI on the market”—first making available an AI system or a GPAI model in the EU
- “Putting AI into service”—the supply of an AI system for first use directly to the deployer or for its own use in the EU
- “Producing AI output”—developing AI systems that produce output impacting EU residents’ education, employment, or product safety
For providers of a high-risk AI system, the AI Act sets stringent compliance requirements throughout their development and use lifecycle, including conformity assessment, risk management, and registration in an EU database.
Deployer
Under the act, a “deployer” is any entity that uses “an AI system under its authority.” The AI Act applies even where a non-EU deployer operates AI systems that produce output used in the EU. Deployers that operate high-risk AI systems must adhere to data governance requirements, monitor AI performance, ensure personnel’s AI literacy, and notify other parties in the AI value chain where the high-risk AI materially malfunctions, among other obligations.
Finally, a deployer could be re-classified as the “provider” of high-risk AI system(s) if it operates the AI systems under the deployer’s own name and brand, or otherwise modifies the AI systems for unintended purposes. An importer or distributor is likewise subject to heightened compliance requirements if it engages in unauthorized branding or modifications.
Product Manufacturer
The AI Act applies to “product manufacturers,” where they provide, distribute, or use AI systems in the EU together with their products and under their own name or trademark. For example, where a US auto original equipment manufacturer incorporates an AI system to support self-driving features or serve as a safety component, then distributes the vehicle in the EU under its own name or trademark, such an OEM is a “product manufacturer” subject to additional compliance obligations.
Importer
Within the AI value chain, an “importer” is a gatekeeper that enables EU market entry for certain AI systems from non-EU providers. An importer must fulfill rigorous due diligence and record-keeping obligations before any EU product launch, including verifying a provider’s conformity assessment and technical documentation, among other items.
An importer must also label the high-risk AI systems with its own contact information and registered trademark, similar to a customs clearance process that requires information about a product’s country of origin, content, and relevant disclaimers.
Distributor
A “distributor” refers to any natural or legal person that provides AI systems or GPAI models for distribution or use on the EU market. Unlike an importer, a distributor is not required to be established in the EU and isn’t necessarily the first party that releases the AI to the EU.
As a critical link in the AI value chain, a distributor must verify that related providers and importers comply with the AI Act. If a distributor suspects non-compliance, it must withdraw the applicable AI until such deficiencies are fully cured.
Authorized Representative
An “authorized representative” is an EU-based entity that functions as an intermediary between AI providers outside the EU on one hand, and European authorities and consumers on the other. A non-EU provider must appoint its authorized representative via a written agreement to carry out certain compliance obligations and procedures.
Actions for Businesses
With most products set to incorporate AI, US companies should ask:
- What are our roles within the AI value chain?
- Are we a provider, deployer, importer, or distributor?
- Are we a product manufacturer that incorporates AI in our products?
- Will our products be distributed, operated, or otherwise provide output used in the EU?
- Are our AI systems classified as “prohibited AI systems,” “high-risk AI systems,” or other risk levels?
The AI Act imposes steep fines: up to 35 million euros (about $38.2 million) or 7% of global turnover—whichever is greater—for prohibited AI practices, and up to 15 million euros or 3% of global turnover for violation of a party’s obligations within the AI value chain. US companies must therefore adapt their product design and compliance mechanisms accordingly.
Outlook
Considering the EU market’s size (EU members account for around 16% of the world’s imports and exports) and strategic value, US-based companies must comply with the AI Act or risk losing the right to promote their offerings in the EU market. To steal a line from a popular adage, AI won’t replace your products, but products with AI governance will replace those without it.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.
Author Information
Vivien F. Peaden is of counsel at Baker Donelson focusing on data privacy and security issues under international and domestic laws.
Belana H. Knossalla of University of Göttingen contributed to this article.