Created by researchers at the University of Chicago, Nightshade is an “offensive” tool that protects artists’ and creators’ work by “poisoning” images, rendering them unsuitable for AI training, writes Alfonso Maruccia.
Here is the link: https://www.techspot.com/news/101600-nightshade-free-tool-thwart-content-scraping-ai-models.html