Artists can poison their pics with deadly Nightshade to deter AI scrapers
(www.theregister.com)
This is the best summary I could come up with:
University of Chicago boffins this week released Nightshade 1.0, a tool built to punish unscrupulous makers of machine learning models who train their systems on data without getting permission first.
"Nightshade is computed as a multi-objective optimization that minimizes visible changes to the original image," said the team responsible for the project.
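The "multi-objective optimization" idea can be illustrated with a toy sketch: nudge an image's feature representation toward a decoy concept while penalizing how much the pixels change. Everything below is illustrative and assumed — the real Nightshade uses a deep feature extractor and perceptual constraints, not the stand-in linear map `W` used here.

```python
import numpy as np

# Toy illustration of the two competing objectives behind image poisoning:
#   1. push the image's features toward a target ("decoy") concept
#   2. keep the pixel-space perturbation small, i.e. visually subtle
# W is a hypothetical stand-in for a learned feature extractor.

rng = np.random.default_rng(0)
D, F = 64, 16                                # pixel dim, feature dim
W = rng.normal(size=(F, D)) / np.sqrt(D)     # stand-in "feature extractor"

x = rng.uniform(0, 1, size=D)                # original image (flattened)
target_feat = rng.normal(size=F)             # features of the decoy concept

lam = 0.5                                    # weight on the "stay invisible" term
lr = 0.05
delta = np.zeros(D)                          # perturbation to optimize

for _ in range(500):
    feat = W @ (x + delta)
    # gradient of objective 1: match the decoy concept's features
    grad_attack = 2 * W.T @ (feat - target_feat)
    # gradient of objective 2: penalize visible change to the image
    grad_visible = 2 * lam * delta
    delta -= lr * (grad_attack + grad_visible)

# how close the perturbed image's features got to the decoy concept
feat_err = np.linalg.norm(W @ (x + delta) - target_feat)
poisoned = np.clip(x + delta, 0, 1)          # final image stays a valid image
print(f"perturbation L2: {np.linalg.norm(delta):.3f}, feature error: {feat_err:.3f}")
```

A model trained on many such poisoned images would associate the wrong features with the image's label, which is the poisoning effect the paper describes.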
Nightshade was developed by University of Chicago doctoral students Shawn Shan, Wenxin Ding, and Josephine Passananti, and professors Heather Zheng and Ben Zhao, some of whom also helped with Glaze.
"Nightshade can provide a powerful tool for content owners to protect their intellectual property against model trainers that disregard or ignore copyright notices, do-not-scrape/crawl directives, and opt-out lists," the authors state in their paper.
The failure to consider the wishes of artwork creators and owners led to a lawsuit filed last year, part of a broader pushback against the permissionless harvesting of data for the benefit of AI businesses.
Matthew Guzdial, assistant professor of computer science at University of Alberta, said in a social media post, "This is cool and timely work!
The original article contains 704 words, the summary contains 174 words. Saved 75%. I'm a bot and I'm open source!