Are the days of generative AI numbered? A tool for “poisoning” datasets has achieved unexpected popularity

The free tool Nightshade, created by researchers at the University of Chicago, was downloaded 250,000 times in its first five days. The program is aimed at digital artists who do not want their images used to train generative AI. If the tool stays this successful, training future models could become considerably harder.

Ben Zhao, project lead and professor of computer science, wrote to VentureBeat reporters that even he had not expected such enthusiasm: “This reaction is beyond anything we could have imagined. Apparently this is a very serious problem for many people.”

This is a remarkable start for a free tool, and it shows how strongly digital content creators want to keep their work from being used to train AI without their consent. The USA alone has more than 2.67 million artists, but Zhao says Nightshade’s user base is likely even wider: many downloads come from other countries, and even from ordinary users trying to protect the photos they post on social networks.

How Nightshade works and why it is popular

I wrote about this tool in October, when it was first introduced to the public. Nightshade is designed to poison generative image models: it alters images at the pixel level so that, to a machine learning algorithm, they appear to contain completely different content, say, cats instead of dogs, or cows instead of cars. The algorithms do not understand what is actually in the pictures they ingest, and the more such “poisoned” pictures end up in a model’s training data, the worse its results become. In the researchers’ tests, just 50-100 “toxic” images were enough for a model to start producing useless, clearly incorrect output.
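
For a rough picture of the mechanism, consider the sketch below. It is not Nightshade’s actual algorithm (the researchers describe that in their paper); it is a generic, minimal example of a pixel-level poisoning attack, assuming PyTorch, an off-the-shelf ResNet as a stand-in feature extractor, and a hypothetical `poison` helper of my own naming. The idea is to nudge an image’s pixels so that the model’s output for it drifts toward a different concept, while the per-pixel change stays small enough to be nearly invisible.

```python
# Illustrative sketch only: NOT Nightshade's published algorithm.
# General idea of pixel-level poisoning: perturb an image so a vision
# model "sees" a different concept in it, while a human sees no change.
import torch
import torch.nn.functional as F
import torchvision.models as models

# Off-the-shelf classifier as a stand-in feature extractor (assumption).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
for p in model.parameters():
    p.requires_grad_(False)

def poison(image, decoy, steps=100, lr=0.01, eps=0.03):
    """Perturb `image` (1x3xHxW, values in [0,1]) so the model's output
    approaches its output for `decoy`, keeping each pixel change <= eps."""
    decoy_feat = model(decoy)                   # the "wrong" concept's features
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        feat = model((image + delta).clamp(0, 1))
        F.mse_loss(feat, decoy_feat).backward() # pull features toward the decoy
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)             # keep the change imperceptible
    return (image + delta).detach().clamp(0, 1)

# A "dog" photo that still looks like a dog to people, but whose model
# features now resemble those of the "cat" decoy.
dog = torch.rand(1, 3, 224, 224)  # placeholder tensors for illustration
cat = torch.rand(1, 3, 224, 224)
poisoned_dog = poison(dog, cat)
```

If enough images altered along these lines end up in a training set, the model’s association between the visible concept and its features degrades, which is the effect the 50-100-image figure above refers to.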

On the project page, Zhao and his colleagues say they developed the tool to “increase the cost of training AI on unlicensed data so that licensing images from their creators becomes a more profitable alternative.”

Shortly after Nightshade’s release in late January 2024, download demand grew so high that the University of Chicago’s servers could not keep up with the stream of requests. The creators had to add mirror links so that people could download copies of the tool from other cloud hosts.

At the moment, the total number of downloads is approaching one million. Meanwhile, the team’s earlier tool Glaze, which prevents AI from copying artists’ signature styles by quietly changing the pixels of an image, has been downloaded 2.2 million times since its release in April 2023.

What’s next for the Nightshade team?

Will there be anything after Nightshade? Perhaps an even stronger “poison” for generative AI?

Under the collective name The Glaze Project, Zhao and his fellow researchers have previously announced their intention to release a tool that combines Glaze (for defense) and Nightshade (for offense). On the one hand, it will prevent style mimicry; on the other, it will actively harm any AI that dares to include a “protected” image in its dataset. The researchers say such a product is at least a month away.

Glaze’s work: on the left, an attempt to copy the artist’s style succeeds; on the right, it does not

“We just have a lot of things on our list right now,” Zhao wrote. “The merged version must be thoroughly tested. So I think we will need at least a month, maybe more, to conduct comprehensive analyses.”

Researchers at The Glaze Project now advise artists to apply Glaze first to their images and then Nightshade, both protecting their style and disrupting the training of AI models. And they say they are encouraged to see users doing just that, even if running two separate programs is a little awkward.

“We warned people that we haven’t done full tests yet, and that they should wait before posting any images protected with Nightshade alone,” Zhao explained. “In response, the artist community said: ‘No problem! We will simply process images with Nightshade and Glaze in two stages!’ We were very encouraged by this.”

There are also plans to release an open-source version of Nightshade, so that the community interested in copyright protection can take over the project’s development in the future.

Ben Zhao notes that he and his colleagues have never been in contact with the makers of generative AI such as OpenAI, Midjourney and Stability AI, and he doubts such contact is likely in the future. Nor is he worried about possible lawsuits against him and his team for the harm done to AI models. “We’re just giving everyday people what they want: the ability to protect their income and their creativity.”
