Human Artists Lose Ground in Legal Battle Against AI

A federal judge appears poised to dismiss most claims in a high-profile lawsuit brought by artists against AI companies. The case has thrust thorny questions about copyright into the limelight as generative AI enters the mainstream.

At a hearing earlier this week, U.S. District Judge William Orrick said the artist plaintiffs should better differentiate their allegations against AI art companies Stability AI, MidJourney and DeviantArt. The news was first reported by Reuters.

The proposed class action alleges that Stability “scraped” billions of images from the web to train its text-to-image AI system, Stable Diffusion, potentially infringing copyrights. It further alleges that the images Stable Diffusion generates are derivative works of those copyrighted images, infringing their owners’ exclusive rights.

Yet Orrick noted it remains “implausible” that specific plaintiff works are implicated, given the scale of training data involved.

Artists vs AI

To understand the lawsuit in simple terms: The plaintiffs say AI companies trained their models on the artists’ work without permission, and that the outputs produced by MidJourney, Stable Diffusion, DALL-E and other AI image generators therefore plagiarize, at least in part, their content.

The defendants, on the other hand, argue that AI models scrape the web to catalog images rather than copy them, much as a person might study a set of Pablo Picasso paintings to learn what makes a Picasso distinct. On this view, styles cannot be copyrighted, AI outputs are not copies of the original artworks, and the training data was publicly available for anyone, whether a person or a computer, to see.

On the question of whether AI-generated images could constitute derivative works infringing the plaintiffs’ original creations, Orrick expressed skepticism. “I don’t think the claim regarding output images is plausible at the moment, because there’s no substantial similarity,” he said.

However, illustrator Sarah Andersen’s claim that Stability AI directly infringed copyrights she holds on several works seems likely to move forward, the judge indicated. That claim does not concern the outputs or the use of AI itself; instead, it targets the use of another artist’s work for commercial gain.

Who Owns the Copyrights? An Age-old But Tricky Question

The debate over copyright in AI-generated works is not new. However, the view expressed by Judge Orrick suggests that AI-generated works are distinct from the data used to train the models, which would leave the plaintiffs without rights over those outputs.

This view aligns with conclusions legal scholars reached decades ago, when AI was far less advanced. Amid the rise of computer-generated works in the 1980s and ’90s, experts deemed allocating copyright to the AI system’s user the most prudent approach, one that rewarded those bringing innovations to market while avoiding over-rewarding programmers.

For example, in 1985, professor Pamela Samuelson of Berkeley Law School and UC Berkeley’s School of Information argued that “allocating rights in computer-generated output to the user of the generator program is the soundest solution to the dilemma.” In contrast, Victor Palace concluded that all AI artwork should enter the public domain: “Allocation of copyright ownership to the artificial intelligence would lead to nonhuman standing, which would lead to unnecessary uncertainty in the legal system,” he wrote in a paper for the Florida Law Review.

But today’s lightning-fast leaps in AI have renewed debate on the issue. Scientists can no longer dismiss systems like ChatGPT and Stable Diffusion as merely inert instruments “animated by elements of human creative genius,” as a congressional commission did decades ago. These tools now display increasing autonomy in generating written prose, images, music and more.

So who owns the output: the AI, the programmers, or the artists whose work trained the models? And could AI creations infringe the copyrights of that training material? Several pending lawsuits aim to provide legal clarity. Using copyrighted works to train AI may constitute infringement, but fair use defenses could potentially apply.

The answers carry high stakes, shaping incentives and rewards across AI as it permeates sectors from education to entertainment. For now, Orrick’s skeptical view on copyright issues sends a preliminary signal as to how courts may treat these thorny AI lawsuits. But like any good legal drama, expect some plot twists before the credits roll.
