Stable Diffusion 3 to Let Artists Opt Out

Analyst: It still will not solve problems of IP violation long term

Deborah Yao, Editor

December 19, 2022

2 Min Read

Stability AI will now let artists opt out of the training dataset for the next version of its popular text-to-image generator Stable Diffusion, according to an artists’ group.

At least, for about the next two weeks.

The window closes when Stable Diffusion 3 begins training, according to Mat Dryhurst, co-founder of Spawning, the group behind 'Have I Been Trained?', a website that lets artists check whether their art or likeness appears in training datasets.

“We do not want any artists present in these models who do not want to be there,” he said. “This has the potential to set a pretty remarkable precedent in the space. (Protecting rights to) the artist’s concept is fundamental. It’s a prerequisite for this space to grow in a way that everyone feels good about.”

To opt out of Stable Diffusion 3, go to HaveIBeenTrained.com and follow the instructions.

But will the dataset still be robust? Stability AI spokeswoman Cristina Pena told AI Business that it will remain expansive. "We have a subset of roughly two billion images, and you really only need 10 million good images in order to create diverse, generative AI results."

Launched in September, Spawning was co-founded by Dryhurst, an artist and researcher who teaches music at New York University, and his wife, musician Holly Herndon. Spawning is “looking to build tools for artists to take ownership of their training data in our AI ecosystem,” he said.


Dryhurst said “thousands” of artists have already opted out. Spawning is planning to launch tools in 2023 that let artists “opt in.”

The group is working toward making “opt in” the default setting for commercial AI applications going forward.

“Opt out lists, although imperfect, are a step in the right direction and have been a hard-fought concession,” Dryhurst added. This is a first step that could one day “establish terms for future protocol of consent.”

But Mark Beccue, principal analyst of AI and NLP at sister research firm Omdia, pointed out that artists having to opt out represents a reversal of the consent process. “Why should original-rights owners have to register their work with a company that wants to use their IP? Shouldn’t it be the other way around?”

For Beccue, the situation is a simple case of IP violation. “This is just another form of copyright infringement; there is nothing new,” he said. “You are going to get sued, and you are going to lose because you are using (other people’s content) without permission.”

"There is plenty of precedent and case law to protect and support copyrights and IP."


About the Author

Deborah Yao

Editor

Deborah Yao runs the day-to-day operations of AI Business. She is a Stanford grad who has worked at Amazon, the Wharton School and the Associated Press.
