Anushka Sakorikar

Nightshade: A Tool to Fight Generative AI Models

Generative AI models are engulfed in controversy because these systems derive their training data from human artists’ work without consent, credit, or compensation. Matters have reached the point where Stability AI is currently facing a class-action lawsuit over the billions of images “scraped” from the internet for its AI models without artists’ permission. So how are artists meant to protect their work?

Nightshade, developed by researchers at the University of Chicago, is a tool created to shield the intellectual property of these artists. It exploits a vulnerability in generative AI models, “poisoning” images by altering their pixels in a way that is invisible to the human eye. These poisoned samples manipulate a model into learning the wrong match between image and text: images of hats start being interpreted as cats, or cars as cows. The resulting outputs are wonky, and the damage is hard to undo, since each corrupted sample must be found and removed individually. As poisoned images spread, they begin to render the model useless.
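
To make the idea concrete, here is a minimal, hypothetical PyTorch sketch of anchor-based perturbation, the general family of technique Nightshade builds on. This is not Nightshade’s actual code: the ToyEncoder, the two random images, and the epsilon budget are all stand-ins for illustration.

# Conceptual sketch of anchor-based data poisoning (NOT Nightshade's
# real algorithm): nudge a "hat" image, within an invisibly small
# pixel budget, until a feature extractor embeds it like a "cat".
import torch
import torch.nn as nn

class ToyEncoder(nn.Module):
    """Stand-in for the image encoder inside a text-to-image model."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 64),
        )
    def forward(self, x):
        return self.net(x)

torch.manual_seed(0)
encoder = ToyEncoder().eval()
hat_image = torch.rand(1, 3, 64, 64)   # hypothetical artwork of a hat
cat_anchor = torch.rand(1, 3, 64, 64)  # hypothetical image of a cat

with torch.no_grad():
    target_embedding = encoder(cat_anchor)

epsilon = 0.03  # max per-pixel change: small enough to stay invisible
delta = torch.zeros_like(hat_image, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.01)

for step in range(200):
    poisoned = (hat_image + delta).clamp(0, 1)
    # Pull the poisoned image's embedding toward the "cat" anchor.
    loss = nn.functional.mse_loss(encoder(poisoned), target_embedding)
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Keep the perturbation inside the invisibility budget.
    with torch.no_grad():
        delta.clamp_(-epsilon, epsilon)

# To a human the poisoned image still looks like a hat; scraped into a
# training set with the caption "hat", it teaches the model cat features.
poisoned_hat = (hat_image + delta).detach().clamp(0, 1)
print("embedding gap:", loss.item())

In a real attack, the encoder would belong to a production text-to-image model, and the poisoned image would simply be posted online under its original “hat” caption, waiting to be scraped.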

Nightshade is also being integrated into Glaze, a tool from the same University of Chicago team that uses machine learning algorithms to compute a minimal set of changes to an artwork, disrupting what the AI model sees while the piece looks unchanged to people. Artists can opt for Nightshade or simply ‘Glaze’ their work.

These tools, however, have their limitations. The “subtle” changes are not always subtle: on artworks with flat colors, the perturbations can be visibly apparent. Glaze is also not future-proof against the ever-evolving nature of AI. And Nightshade carries the risk of users abusing the data poisoning technique, though real damage would require thousands of poisoned samples.

So while these technologies may not be perfect, they are a step in the right direction: a deterrent, for now, against the invasive nature of generative AI, and a push toward positive change.

Bibliography

Brittain, Blake. “Judge Pares down Artists’ AI Copyright Lawsuit against Midjourney, Stability AI.” Reuters, 30 Oct. 2023, https://www.reuters.com/legal/litigation/judge-pares-down-artists-ai-copyright-lawsuit-against-midjourney-stability-ai-2023-10-30/.

“What Is Glaze.” Glaze, University of Chicago, https://glaze.cs.uchicago.edu/what-is-glaze.html. Accessed 21 Nov. 2023.

“Stable Diffusion Litigation.” Joseph Saveri Law Firm & Matthew Butterick, https://stablediffusionlitigation.com/. Accessed 21 Nov. 2023.

“This New Data Poisoning Tool Lets Artists Fight Back against Generative AI.” MIT Technology Review, 23 Oct. 2023, https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/. Accessed 21 Nov. 2023.
