- Landmark regulation requires summaries of the content AI models use
- Compliance could tip off rights owners that their work is being used
The European Union’s historic Artificial Intelligence Act, which takes effect this week, could expose even US companies to copyright litigation as they seek to comply with its intellectual property provisions, legal researchers say.
The set of regulations takes a risk-based, sector-agnostic approach to enacting the most comprehensive guardrails yet on the development and use of AI. Over the next few years, companies will face staggered compliance deadlines under the looming threat of hefty fines for noncompliance.
That includes Silicon Valley companies chasing EU customers. The AI Act will have trans-Atlantic impact much like the groundbreaking General Data Protection Regulation, which has made website cookie disclosures ubiquitous.
Beginning in August 2025, the new act will require companies providing “general-purpose AI models” in the EU to make public a “sufficiently detailed” summary disclosing the content used to train the AI. There’s currently no equivalent federal requirement in the US, though California lawmakers are considering calling for similar disclosure.
Those EU summaries could provide fodder for authors and other creatives trying to show that AI firms are using their works without permission.
“Now, this summary, arguably, is going to act as pre-action discovery on every general-purpose AI model which touches the EU,” said attorney Matt Hervey, the general editor of the reference text The Law of Artificial Intelligence.
The act puts EU intellectual property law’s public interest focus in sharp contrast with US norms, which center creators’ and inventors’ interests, Harvard Law School professor Ruth Okediji said. As that European approach reaches across the ocean, US companies will face some regulatory questions for the first time.
“The EU markets are more regulated explicitly on the grounds of human welfare and human well-being,” Okediji said. “But we tend to, in our innovation culture, we innovate first and then we figure out the consequences and how to regulate it later.”
Litigation Risks
Article 53 of the AI Act requires providers of general-purpose AI models to “make publicly available a sufficiently detailed summary about the content used for training of the general-purpose AI model.”
These generative models are trained on vast amounts of content, often including copyrighted works.
The specifics of what “sufficiently detailed” includes still need to be ironed out by EU officials, who will later issue templates for the disclosures. They’ll likely require a degree of specificity that would facilitate copyright owners seeking to enforce their rights, researchers and attorneys following the act said.
“It’s hard to see a scenario where we don’t end up with some granularity,” said Joe Jones, director of research and insights at the International Association of Privacy Professionals. That could provide a boon to copyright holders who feel their work was stolen to train the new technology.
Authors and creatives have already filed a number of lawsuits against AI companies. In early July, former Arkansas governor Mike Huckabee and comedian Sarah Silverman joined forces, consolidating their high-profile lawsuits accusing Meta of using a database of pirated books that contains their work to train its LLaMa generative AI model.
For AI companies that haven’t already publicized the data sets used to train their models, the EU-required summaries could be “tipping off those parties that have proprietary interests,” Jones said.
That, in turn, could set AI providers up for costly litigation.
The effect could be most felt in cases that haven’t been filed yet, because parties in ongoing disputes probably already have access to similar information, if not more, through the discovery process, said Mauricio Uribe, co-chair of Knobbe Martens’ software/IT and electrical practice groups.
Those litigation risks might push AI companies to roll out EU-specific models when the disclosure requirement comes into force.
“Major AI companies are talking about having separate models for Europe because of this impact,” Hervey said.
Crafting Summaries
The act’s requirements regarding general-purpose AI models will become binding in August 2025, though models already on the market by then will have until 2027 to become compliant. Maximum fines for violations could total up to either 3% of the provider’s annual worldwide turnover or €15 million (roughly $16.2 million), whichever is higher.
IP and AI researchers stressed that building these summaries could be a heavy lift, especially for organizations that haven’t rigorously tracked what they’ve fed into their models.
The challenge for engineering teams will be to uncover the data they’ve used to train AI, Jones said. Because it’s rare for technologies to be developed with regulatory compliance in mind from the beginning, collaboration between legal and engineering teams will be important, he said.
For smaller companies with fewer resources, the task of retrospectively unraveling how AI was trained could be even more difficult.
“I think early regulation generally kills innovation and benefits incumbents,” said Venky Ganesan, a partner at Menlo Ventures.
Ganesan said it’s too early for regulation given that AI is still taking shape. A voluntary set of policies designed by the people at the center of the industry would have been a more effective starting point, he said.
“The tech industry needs to come together as a group and propose self-regulation, because they have the best knowledge of the technology,” Ganesan said.
Even with at least a year before the first disclosures have to be published, Uribe said, companies should revisit compliance preparations regularly while awaiting further EU guidance to make future deadlines less daunting and costly.
“Even though you don’t have to do something now,” he said, “start early.”
Bloomberg Law provides trusted coverage of current events enhanced with legal analysis.