Artists may soon have a new weapon to help protect their work from becoming fodder for machine learning. The tool, called Nightshade, makes tiny changes to the pixels in a digital work of art to effectively “poison” it, rendering the image unusable for the purposes of AI training.
MIT Technology Review reports that a team led by Professor Ben Zhao of the University of Chicago submitted Nightshade for peer review at the USENIX computer security conference. The software works by making small changes to an image that, while invisible to the human eye, cause AI algorithms to completely misidentify it.
For example, an artist can paint a picture of a cat that can be clearly identified as a cat by any human or AI that examines it. However, after using Nightshade, humans still see the same image, while the AI incorrectly believes it is a dog.
Flood the AI with enough bad training material like this, and soon a request for a picture of a cat will cause it to generate a dog instead.
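The core idea, a perturbation too small for a human to notice but large enough to push an image across a model's decision boundary, can be sketched with a toy nearest-centroid "classifier." Everything below is invented for illustration (the centroids, the image, the 0.02 pixel budget); Nightshade's actual attack, which optimizes perturbations against large text-to-image models, is far more sophisticated.

```python
import numpy as np

# Toy stand-in for an image classifier: nearest-centroid over flattened
# pixel vectors. The "cat" and "dog" prototypes are invented for this sketch.
rng = np.random.default_rng(0)
cat_centroid = rng.random(64)       # hypothetical 8x8 grayscale "cat" prototype
dog_centroid = cat_centroid + 0.2   # a nearby "dog" prototype

def classify(img):
    """Return the label whose prototype is closest to the image."""
    d_cat = np.linalg.norm(img - cat_centroid)
    d_dog = np.linalg.norm(img - dog_centroid)
    return "cat" if d_cat < d_dog else "dog"

# An image that a model (and a human) would call a cat.
image = cat_centroid + 0.09         # clearly closer to the cat prototype

# Poisoning-style perturbation: nudge every pixel slightly toward the dog
# prototype. Each pixel moves by at most 0.02 on a 0..1 intensity scale,
# far below what a human eye would notice.
perturbation = 0.02 * np.sign(dog_centroid - image)
poisoned = image + perturbation

print(classify(image))     # "cat"
print(classify(poisoned))  # "dog": the label flips despite tiny pixel changes
```

The example is deliberately contrived (the clean image sits near the decision boundary), but it captures the mechanism the article describes: a model trained on many such mislabeled-looking images learns to associate cat-like pixels with "dog."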
Of course, it is unlikely that just a single poisoned image will have a significant impact on an AI image generator’s algorithm. Its training data would have to be distorted by thousands of altered images before a real effect could be seen.
However, AI image generators routinely scrape thousands of new samples from the internet to refine their models. If enough artists run their images through Nightshade before uploading them, such AI tools could eventually become unusable.
Additionally, it would be incredibly difficult for AI companies to fix the problem, as each poisoned image would have to be individually identified and removed from their training pool. This could create a strong incentive for such companies to think twice before dragging a net across the internet and using artists’ work without their express consent.
This isn’t the first AI-disrupting tool Zhao’s team has developed. The group previously released Glaze, a tool that similarly alters an image’s pixels to mask an artist’s personal style from AI scrapers. Nightshade will eventually be integrated into Glaze and made open source so others can build on the work and protect artists.