A first-of-its-kind US copyright lawsuit targeting AI art generators, which have enjoyed explosive growth in recent months, could limit the number of images the tools ingest for training, ultimately affecting the content that they produce.
A group of artists filed a potential class action against Stability AI Ltd., a billion-dollar company, and two other art-generator makers, Midjourney Inc. and DeviantArt Inc., over their use of copyrighted images to train artificial intelligence tools. They claim that the generators downloaded and used billions of copyrighted images without obtaining the consent of, or compensating, any of the artists.
In addition to damages, the lawsuit—filed by Sarah Andersen, author of the web comic “Sarah’s Scribbles,” and fellow artists Kelly McKernan and Karla Ortiz—asks the court to stop the AI generator companies from using artists’ work without permission, which could mean upending how AI tools produce content.
“It will change how the model performs if instead of billions of images, they’re using much more tightly curated data sets,” said Ryan Abbott, an attorney at Brown, Neri, Smith & Khan LLP.
Though this lawsuit is the first involving AI-training images, it follows two programmers’ November 2022 copyright suit accusing OpenAI, Microsoft Corp., and GitHub of training the Copilot coding tool on their code without permission.
Getty Images announced on Tuesday that it initiated copyright infringement legal proceedings against Stability AI in a UK court, alleging it used Getty’s digital images without a license.
Fair Use
The potential class action, filed in the Northern District of California, describes Stability AI’s popular text-to-image model Stable Diffusion as “merely a complex collage tool.” The new images it generates are derivative works of the pieces the tool was trained on, the suit claims.
Some attorneys agreed that the copyright infringement allegations have merit. But Stability AI told Bloomberg Law in a statement on Tuesday, “Anyone that believes that this isn’t fair use does not understand the technology and misunderstands the law.”
The success of a fair use defense will depend on whether the works generated by the AI are considered transformative—whether they use the copyrighted works in a way that significantly varies from the originals.
“The minute you’re able to make a work that’s transformative, you removed it from the ambit of this idea that it’s a mere derivative of the original work,” said Vivek Jayaram, founder of Jayaram Law.
Jayaram said that previous case law, particularly the Supreme Court’s 2021 Google v. Oracle decision, suggests that using collected data to create new works can be transformative. In that case, Google’s use of portions of Java SE code to create its Android operating system was found to be fair use.
“Using IP or that training data to allow others to use the tool to create new work—that construct, in and of itself, I don’t think will be found to be completely infringing,” said Jayaram. He cautioned, though, that without transformation of a work, AI generators essentially serve as digital art counterfeiting machines.
Matthew Sag, a law professor at Emory University focused on AI, said that AI training could be considered fair use under a 2015 Second Circuit case that the Supreme Court declined to review. The decision in Authors Guild v. Google found that digitizing books and displaying snippets of them without permission to allow text searching is transformative fair use.
In that context, the fair use was clear, Sag said, “because none of the original expression leaks out in the end.”
“Information about the books isn’t the same as the content of the books,” he said. “A library catalog is not the library, it’s just the catalog.”
Some artists, however, say that aspects of their original works—even watermarks—have shown up in AI-generated images. Sag allowed that this could complicate the analysis and that in such cases “the original copying might not be fair use.”
“Fair use is a very, very fact-intensive inquiry,” said Jayaram. “What is transformative and what is not transformative is going to vary piece by piece, judge by judge, and jury by jury.”
Can Copyright Protect Style?
In the complaint, the artists allege that AI tools permit users to create works “in the style of” a given artist instead of commissioning or licensing an original work, violating the rights of millions of artists and resulting in works that compete with the originals.
Style has proven difficult, if not impossible, to protect with copyright, some attorneys say.
“Style is an idea, style is a technique, it’s a process,” said University of Kentucky law professor Michael Murray. “These are things that are not protected by copyright.”
Given an AI generator’s ability to draw directly from the works of art it’s trained on, however, there are circumstances in which “the line between emulating style and wrongful copying is not always apparent,” said Columbia Law Professor Jane Ginsburg.
The circumstances may require a reckoning with current copyright standards, said Sag.
“I think the question for copyright law is, ‘Do we need to push this line out a little bit, in light of the new reality that people can do this kind of style transfer at a scale that we just never really imagined before?’” he said.
What’s Next?
“We all knew lawsuits of this nature were going to come, and now they have—and they pose some very interesting legal questions,” said Jayaram.
Noting the commercial importance of these issues, he predicted that “after making its way through the courts, AI and IP is something that’s going to get legislative attention in the not-too-distant future.”
Attorneys also said that this is just the beginning of copyright litigation concerning AI generators. Artists may square off against each other over original and AI-generated artworks, with the companies that produce the generators named for contributory infringement, they said.
Some are afraid that copyright infringement suits against AI generator makers may stifle innovation in the field, which also includes OpenAI’s DALL-E text-to-image tool.
“This lawsuit is almost a perfect example of a little fire, people complaining about one particular aspect of generative AI,” said Murray.
If it succeeds, he said, “that would just slam the doors on the research, it would slam the doors on certainly the applications that Stability AI has created and DALL-E, and all the other products that follow a very similar formula.”
Joseph Saveri of the Joseph Saveri Law Firm and Matthew Butterick, who represent the artists suing Stability AI, disagree. They say it’s possible to create a licensing model—similar to the music world’s transition from file-sharing sites like Napster to platforms such as Spotify and iTunes that license content legally—that would allow artists and AI to peacefully coexist.
“There are ways of allowing the technology to be used without damaging the artists in this case,” said Butterick.