Nightshade: The Innovative Free Tool Empowering Artists to ‘Poison’ AI Models Is Now Accessible



Months after its initial announcement, Nightshade, a free software tool that allows artists to “poison” AI models that train on their work, is now available for download. Created by computer scientists at the University of Chicago’s Glaze Project under Professor Ben Zhao, the tool essentially pits AI against AI. Using the popular open-source machine learning framework PyTorch, Nightshade identifies what is in an image and applies a subtle pixel-level alteration, so that other AI programs see something entirely different from what is actually there.
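To give a rough sense of what altering an image at the pixel level can do, here is a minimal PyTorch sketch of a targeted perturbation against an ordinary image classifier. This is not Nightshade’s actual method, which is optimized against text-to-image training data; the classifier, target label, and file name below are illustrative assumptions.

```python
# Conceptual sketch only, not the Nightshade algorithm: a targeted pixel-level
# perturbation in PyTorch, showing how tiny changes can shift what a model "sees".
# The classifier, target label, and file name are illustrative assumptions;
# Nightshade itself optimizes against text-to-image training pipelines.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

img = preprocess(Image.open("cow.jpg").convert("RGB")).unsqueeze(0)  # hypothetical input image
target = torch.tensor([414])  # ImageNet class 414 ("backpack"), a stand-in for "handbag"

delta = torch.zeros_like(img, requires_grad=True)   # the pixel-level change to learn
optimizer = torch.optim.Adam([delta], lr=0.01)
epsilon = 8 / 255                                    # keep the change hard to notice

for _ in range(100):
    optimizer.zero_grad()
    out = model((img + delta).clamp(0, 1))
    loss = F.cross_entropy(out, target)              # push the model toward the wrong concept
    loss.backward()
    optimizer.step()
    delta.data.clamp_(-epsilon, epsilon)             # bound the per-pixel shift

poisoned = (img + delta).clamp(0, 1)                 # looks like a cow to people; not to the model
```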

Nightshade v1.0 is ready for use, with performance tuning and UI fixes complete. It can be downloaded from the project’s website, and the team recommends reading the what-is page and User’s Guide first, as it is more complex to use than Glaze.

Nightshade is the second tool from the team, following Glaze, which was designed to alter digital artwork to confuse AI training algorithms about the style of an image. While Glaze serves as a defensive tool, Nightshade is intended to be offensive. An AI model trained on many Nightshade-altered images might misidentify objects, even in images not altered by Nightshade.

Artists using Nightshade need a Mac with Apple silicon (an M1, M2, or M3 chip) or a PC running Windows 10 or 11. The tool is downloadable for both operating systems, and the Windows version can run on supported Nvidia GPUs. Due to high demand, some users have reported long download times.

Users must agree to the Glaze/Nightshade team’s end-user license agreement (EULA), which prohibits modifying the source code and using the software for commercial purposes. Nightshade v1.0 transforms images into “poison” samples, so that AI models trained on them learn unpredictable behaviors, such as generating an image of a handbag when prompted for a cow.

Nightshade’s effects remain resilient to typical image transformations, such as cropping or resampling, ensuring the poison effect persists. The team clarifies that their goal is not to break AI models but to increase the cost of training on unlicensed data, encouraging AI developers to license images from creators.
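As a rough illustration of what surviving cropping or resampling would mean, one could re-crop and resize the perturbed image from the sketch above and see whether the classifier’s misprediction persists. The transform settings here are assumptions for illustration; Nightshade’s own resilience comes from its optimization, not from this kind of check.

```python
# Rough check continuing the sketch above: does the altered prediction survive a
# crop and resample? The transform settings are assumptions for illustration;
# Nightshade's resilience comes from its own optimization, not from this test.
from torchvision import transforms

reencode = transforms.Compose([
    transforms.CenterCrop(200),   # simulate a crop
    transforms.Resize(224),       # resample back to the model's input size
])
print(model(reencode(poisoned)).argmax(dim=1))  # still the "wrong" class if the effect persists
```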

The creation of Nightshade addresses the issue of AI models being trained on data scraped from the internet without artists’ consent, a practice that threatens their livelihoods. AI model makers argue that data scraping is necessary and lawful under “fair use.” However, the Glaze/Nightshade team notes that opt-out lists are often ignored, making Nightshade a necessary tool to level the playing field.

Used responsibly, Nightshade can deter model trainers who disregard copyrights and opt-out directives. While it can’t reverse past data scraping, it aims to make future scraping more costly and encourage licensing agreements with artists.
