In August of this year, Stability AI open-sourced an AI model called Stable Diffusion, which generates images from text prompts provided by users. Stable Diffusion works by "learning" from a large dataset of images scraped from the Internet.

This AI has stirred considerable controversy in art circles, especially among painters. Some have said bluntly that "AI will take away our jobs." Many artists have begun to resist AI's entry into the art world, placing "No AI Art" images on their personal homepages to protest against AI-generated artwork.

Last Wednesday, Stability AI announced that artists will be allowed to remove their work from the training dataset for the upcoming Stable Diffusion 3.0. Artists need to find their work on the Have I Been Trained website and then opt it out of the training set.

Out of curiosity, I uploaded a random picture to search and found many anime/girl-related images on the site. But opting images out of the training set requires registering an account. (Incidentally, the site's password policy is quite strict: it requires uppercase and lowercase letters plus numbers and special characters.)
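The password policy described above can be sketched as a simple validator. This is a hypothetical illustration of the stated rules (uppercase + lowercase + digits + special characters), not the site's actual validation code, and the real site's rules may differ in details such as minimum length:

```python
import re

# Hypothetical rules matching the described policy: at least one
# uppercase letter, one lowercase letter, one digit, and one
# special (non-alphanumeric) character.
PASSWORD_RULES = [
    re.compile(r"[A-Z]"),        # uppercase letter
    re.compile(r"[a-z]"),        # lowercase letter
    re.compile(r"[0-9]"),        # digit
    re.compile(r"[^A-Za-z0-9]"), # special character
]

def is_valid_password(pw: str) -> bool:
    """Return True if pw satisfies every rule above."""
    return all(rule.search(pw) for rule in PASSWORD_RULES)

print(is_valid_password("Abc123!x"))  # True
print(is_valid_password("abc123"))    # False
```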

After registering and logging in, I tried selecting some images to "opt out of training," and there is a big problem with this step: I did not need to prove that the images were my own work at all. The site cannot verify a user's ownership of an image.

There is another problem: to remove your work from training, it must already be in the LAION dataset and be searchable on the Have I Been Trained website, yet multiple copies of the same image may exist inside other images or in other datasets. Moreover, even after an image is actively deleted, the dataset may later scrape your work from the Internet again.

This raises the question of how to constrain Stability AI's scraping of datasets from the Internet. At present, Stability AI operates within the legal bounds of the United States and Europe, and the image data used to train Stable Diffusion is largely collected directly from the Internet without the original authors' permission.

Some have pointed out that Stability AI's "opt-out" mechanism does not comply with the European General Data Protection Regulation, which requires the active consent of the original author for data collection rather than consent by default. Along these lines, many think all works of art should be excluded from AI training sets by default, and artists who want to contribute to AI can choose to add their own works to the datasets.

The jury is still out on the issue, and the newly launched Have I Been Trained website is clearly not very popular. But Emad Mostaque, CEO of Stability AI, has at least engaged amicably with the ethical controversy between AI and art, and has been open to artists' advocacy and suggestions. He wrote on Twitter:

Our team is very open to feedback and hopes to build better datasets for all.

From our perspective, we believe this is a transformative technology and are happy to engage with all parties and be as transparent as possible.

Everything is developing and maturing rapidly.



