
This recent data poisoning tool lets artists fight back against generative AI



Zhao admits there’s a risk that people might abuse the data poisoning technique for malicious uses. However, he says attackers would need thousands of poisoned samples to inflict real damage on larger, more powerful models, as they are trained on billions of data samples. 

“We don’t yet know of robust defenses against these attacks. We haven’t yet seen poisoning attacks on modern [machine learning] models in the wild, but it could be just a matter of time,” says Vitaly Shmatikov, a professor at Cornell University who studies AI model security and was not involved in the research. “The time to work on defenses is now,” Shmatikov adds.

Gautam Kamath, an assistant professor at the University of Waterloo who researches data privacy and robustness in AI models and wasn’t involved in the study, says the work is “fantastic.” 

The research shows that vulnerabilities “don’t magically go away for these new models, and in fact only become more serious,” Kamath says. “This is especially true as these models become more powerful and people place more trust in them, since the stakes only rise over time.” 

A robust deterrent

Junfeng Yang, a computer science professor at Columbia University, who has studied the security of deep-learning systems and wasn’t involved in the work, says Nightshade could have a big impact if it makes AI companies respect artists’ rights more—for example, by being more willing to pay out royalties.

AI companies that have developed generative text-to-image models, such as Stability AI and OpenAI, have offered to let artists opt out of having their images used to train future versions of the models. But artists say this is not enough. Eva Toorenent, an illustrator and artist who has used Glaze, says opt-out policies require artists to jump through hoops and still leave tech companies with all the power. 

Toorenent hopes Nightshade will change the status quo. 

“It’s going to make [AI companies] think twice, because they have the possibility of destroying their entire model by taking our work without our consent,” she says. 

Autumn Beverly, another artist, says tools like Nightshade and Glaze have given her the confidence to post her work online again. She previously removed it from the internet after discovering it had been scraped without her consent into the popular LAION image database. 

“I’m just really grateful that we have a tool that can help return the power back to artists for their own work,” she says.
