Image-generating AI seems to be stuck between a rock and a hard place. To work well, it needs a massive treasure trove of well-annotated artwork to train it. That could get expensive, but it's free if you take it from the internet without asking. That latter bit has artists understandably upset, and the new Nightshade tool might give them the means to fight back.
Aside from Shutterstock and Adobe, which have access to large libraries of licensed artwork, most image-generating AI is trained on data scraped from the internet. It's a "better to ask forgiveness than permission" way of doing things, only without even asking forgiveness. But without that process, DALL-E, Stable Diffusion, Midjourney, and others wouldn't work as well as they currently do.
That leads to an ethical question: is it okay to use someone's art to train AI art generators without their permission? And if so, where is the line drawn? Sometimes these data sets are built for pure research, with no profit motive. In other cases, the intent is purely commercial, which means a company will benefit from an artist's work without ever compensating the artist.
Unfortunately for an artist who doesn't want their work used for AI at all, opting out is difficult. Not every company offers an opt-out option (Meta, for one, despite earlier reports otherwise), and those that do often promise only to remove the art from future learning models, not from those that already exist.
That's where Nightshade comes in. As first reported by MIT Technology Review, Nightshade builds on earlier work by the University of Chicago to give artists a choice in how their artwork is used. The original version, Glaze, hid art styles from AI: a model might still recognize that a piece depicts a dog, for example, but it couldn't learn to imitate the artist's style.