Nightshade registers 250,000+ downloads within days of release

February 2, 2024


The University of Chicago’s data poisoning tool, Nightshade, amassed over 250,000 downloads within just five days of its debut on January 18, 2024. 

Designed to empower visual artists in the fight against unauthorized use of their work by AI models, Nightshade offers a novel approach: by “poisoning” the images models are trained on, it disrupts their ability to learn from artists’ work.

This process works by subtly altering images at the pixel level, so they read differently to AI models while remaining essentially unchanged to the human eye.

When an AI model is trained on poisoned images, its outputs become flawed, giving artists a long-term deterrent against data scraping.
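
For readers curious about the mechanics, here is a deliberately simplified sketch of the general idea, not Nightshade’s actual method (its real perturbations are carefully optimized rather than random). The function name, the EPSILON budget, and the file names are all illustrative assumptions; the point is simply “nudge each pixel within a tiny budget so the file changes for a machine but not for a person.”

```python
# Toy sketch only: a small, bounded pixel-level change that is hard to see.
# Nightshade's real perturbations are carefully optimized; here we just
# sample random noise to show the "alter pixels slightly" idea.
import numpy as np
from PIL import Image

EPSILON = 4  # illustrative per-pixel budget on a 0-255 scale

def poison_stub(path_in: str, path_out: str, seed: int = 0) -> None:
    """Save a copy of the image with an imperceptible, bounded perturbation."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed)
    noise = rng.integers(-EPSILON, EPSILON + 1, size=img.shape)  # stand-in pattern
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Example (hypothetical file names):
# poison_stub("artwork.png", "artwork_shaded.png")
```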

AI developers will want to avoid damaging their models by training on poisoned images, which puts the ball in their court.

They’ll either engineer a solution or change their data scraping practices to avoid copyrighted works that are likely to be poisoned.

Nightshade works best when used at scale – the more poisoned images that find their way into an AI model’s training data, the stronger the impact.

If, in theory, every image in a dataset were poisoned, the AI model would struggle to generate anything coherent – its outputs would be a glitchy mess.

On a lower scale, poisoned images manipulate the model into learning false associations, meaning a prompt of a “cat on a cloud” might produce something that looks like “a handbag in a field.” 
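
A toy numerical sketch can make this concrete. Everything here is an assumption for illustration: a made-up two-dimensional “feature space,” made-up cluster positions for “cat” and “handbag,” and a trivial mean-of-examples “learner” standing in for real model training. It only demonstrates the intuition that the more mislabeled examples enter the training mix, the further the learned concept drifts.

```python
# Toy sketch (not Nightshade's method): shows how a growing fraction of
# poisoned training examples drags a learned concept toward the wrong one.
import numpy as np

rng = np.random.default_rng(0)
CAT = np.array([1.0, 0.0])      # pretend "cat" lives here in feature space
HANDBAG = np.array([0.0, 1.0])  # pretend "handbag" lives here

def learned_cat_prototype(poison_fraction: float, n: int = 1000) -> np.ndarray:
    """Mean of all examples *labeled* 'cat'; poisoned ones actually look like handbags."""
    n_poison = int(n * poison_fraction)
    clean = CAT + 0.05 * rng.standard_normal((n - n_poison, 2))
    poisoned = HANDBAG + 0.05 * rng.standard_normal((n_poison, 2))
    return np.vstack([clean, poisoned]).mean(axis=0)

for frac in [0.0, 0.1, 0.5, 1.0]:
    proto = learned_cat_prototype(frac)
    print(f"poison={frac:.0%}  learned 'cat' prototype: {np.round(proto, 2)}")
```

At 0% poison the learned prototype sits squarely on the cat cluster; at 50% it lands halfway between the two concepts; at 100% it has become a handbag in all but name.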

AI companies already have vast quantities of data at their fingertips, but if they fail to build new, contemporary datasets, their models will eventually lose touch with reality.

Ben Zhao, the project leader and a computer science professor at the university, expressed astonishment at the tool’s reception, stating to VentureBeat, “I still underestimated it. The response is simply beyond anything we imagined.” 

This was echoed in a team statement on social media highlighting the unprecedented demand: “The demand for Nightshade has been off the charts,” they said, noting that the flood of global downloads briefly overwhelmed the university’s network infrastructure, a surge the team initially mistook for a possible cyber attack.

The tool is the latest in a series of efforts by the team, following Glaze, another tool aimed at misleading AI models about an artwork’s style. The developers are currently exploring ways to integrate Nightshade and Glaze, offering a layered defense for digital content.

Nightshade is fairly complex and resource-intensive, requiring a reasonably powerful computer to run.

Poisoning multiple images is also time-consuming, but once it’s done, it’s done. Nightshade’s user interface, while not particularly eye-catching, is pretty intuitive, all things considered, guiding users through the process with straightforward settings and rendering outputs in 30 to 180 minutes, depending on the chosen parameters.

Artists and tech enthusiasts are using the tool, with some advocating for the widespread use of Nightshade on all personal digital content to dilute the data pool for AI models. Strength in numbers, as they say. 

The fightback from artists against AI developers using copyrighted work has been immense this year, with artists from all quarters of the creative spectrum rallying on X and other social media platforms, heeding the clarion call of advocates like Reid Southen and Jon Lam.

It’s not just artists that AI developers have to worry about – they’re also fighting a series of copyright lawsuits and battling increasing energy demands and impending semiconductor shortages. 

While Nightshade remains somewhat complex to use, there are plans to integrate it with the social and portfolio application Cara, currently in open beta, hinting at wider adoption.

Eventually, Nightshade could be hosted fully in the cloud, making it highly accessible. 

As the digital landscape continues to evolve, tools like Nightshade represent both a technical solution and a symbol of the ongoing struggle for control and respect in the AI era.

This is another ‘AI race’ in an industry fraught with tension right now.

