
AI tools can create new images, but who is the real artist?


NEW YORK (AP) — Countless artists have taken inspiration from "The Starry Night" since Vincent Van Gogh painted the swirling scene in 1889.

Now artificial intelligence systems are doing the same, training themselves on a vast collection of digitized artworks to produce new images you can conjure in seconds from a smartphone app.

The images generated by tools such as DALL-E, Midjourney and Stable Diffusion can be weird and otherworldly but also increasingly realistic and customizable — ask for a "peacock owl in the style of Van Gogh" and they can churn out something that might look similar to what you imagined.

But while Van Gogh and other long-dead master painters aren't complaining, some living artists and photographers are starting to fight back against the AI software companies creating images derived from their works.

Two new lawsuits — one this week from the Seattle-based photography giant Getty Images — take aim at popular image-generating services for allegedly copying and processing millions of copyright-protected images without a license.

Getty said it has begun legal proceedings in the High Court of Justice in London against Stability AI — the maker of Stable Diffusion — for infringing intellectual property rights to benefit the London-based startup's commercial interests.

Another lawsuit filed Friday in a U.S. federal court in San Francisco describes AI image-generators as "21st-century collage tools that violate the rights of millions of artists." The lawsuit, filed by three working artists on behalf of others like them, also names Stability AI as a defendant, along with San Francisco-based image-generator startup Midjourney, and the online gallery DeviantArt.

The lawsuit said AI-generated images "compete in the marketplace with the original images. Until now, when a purchaser seeks a new image 'in the style' of a given artist, they must pay to commission or license an original image from that artist."

Companies that provide image-generating services typically charge users a fee. After a free trial of Midjourney through the chatting app Discord, for instance, users must buy a subscription that starts at $10 per month or up to $600 a year for corporate memberships. The startup OpenAI also charges for use of its DALL-E image generator, and Stability AI offers a paid service called DreamStudio.

Stability AI said in a statement that "Anyone that believes that this isn't fair use does not understand the technology and misunderstands the law."

In a December interview with The Associated Press, before the lawsuits were filed, Midjourney CEO David Holz described his image-making subscription service as "kind of like a search engine" pulling in a wide swath of images from across the internet. He compared copyright concerns about the technology with how such laws have adapted to human creativity.

"Can a person look at somebody else's picture and learn from it and make a similar picture?" Holz said. "Obviously, it's allowed for people and if it wasn't, then it would destroy the whole profession­al art industry, probably the nonprofess­ional industry too. To the extent that AIs are learning like people, it's sort of the same thing and if the images come out differentl­y then it seems like it's fine."

The copyright disputes mark the beginning of a backlash against a new generation of impressive tools — some of them introduced just last year — that can generate new images, readable text and computer code on command.

They also raise broader concerns about the propensity of AI tools to amplify misinformation or cause other harm. For AI image generators, that includes the creation of nonconsensual sexual imagery.

Some systems produce photorealistic images that can be impossible to trace, making it difficult to tell the difference between what's real and what's AI. And while most have some safeguards in place to block offensive or harmful content, experts say it's not enough and fear it's only a matter of time until people utilize these tools to spread disinformation and further erode public trust.

"Once we lose this capability of telling what's real and what's fake, everything will suddenly become fake because you lose confidence of anything and everything," said Wael AbdAlmagee­d, a professor of electrical and computer engineerin­g at the University of Southern California.
