
The folks who made Glaze, a tool that protects your art from being reproduced by Stable Diffusion and other text-to-image programs, have released a new tool called Nightshade. It works similarly to Glaze, making adjustments to images that are imperceptible to humans but that change how an "AI art" program ingests them. Instead of simply preventing reproduction, though, Nightshade actively "poisons" training data, creating bad associations between words and images that make the resulting model less useful. The stated goal is to disincentivize mass image scraping, and to push the companies behind these tools to actually get permission for the images they train their models on.
Neat! Fuck 'em!