This new data poisoning tool lets artists fight back against generative AI.
The tool, called Nightshade, subtly corrupts training data in ways that could cause serious damage to image-generating AI models.
Read in full here:
This thread was posted by one of our members via one of our news source trackers.