Nightshade, the “poison” for image-generating AI

  • October 25, 2023

Although still at the project stage, Nightshade aims to be an effective answer to a problem that designers, photographers, painters, and other creators have been denouncing, to the point of exhaustion, for some time now. I am talking, as you have probably already guessed, about the unauthorized use of their creations to train the generative artificial intelligence models that are so ubiquitous today.

A few weeks ago, at the presentation of DALL-E 3, OpenAI confirmed that it will use an opt-out model. What does that mean? By default, the company considers itself authorized to train its models with third-party creations, but it offers creators the option to request that their work not be used for this purpose. This is, of course, better than nothing, but the ideal (though admittedly more complicated) would be the opposite: an opt-in model, in which technology companies would only use the data they are expressly authorized to use.

And that is in the models that at least offer creators this possibility, because we can assume that in some cases it is not even considered. In those cases, creators face the fact that publishing their work, through any channel, automatically and inevitably makes it subject to ingestion by algorithms administered by people with little or no respect for intellectual property. And this is where Nightshade comes in.

Nightshade, "poison" for generative AI

Image: MIT Technology Review

Designed by a team at the University of Chicago, Nightshade "distorts" image files in a way that leaves them visually unchanged but fools the models, causing them to misinterpret the images' content. The more "poisoned" images are used to train a model, the more likely it is to fail later, when it is asked to generate new images.
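To give an intuition for how this kind of "poisoning" works: techniques in this general family nudge each pixel within a small, visually negligible budget so that the features a model extracts from the image drift toward those of a different concept. The paper is still under review, so the sketch below is not Nightshade's actual algorithm; it is a minimal, self-contained Python illustration of that idea, using a random linear map as a stand-in for a real feature extractor. Every name and number in it is illustrative.

    import numpy as np

    # Toy illustration of perturbation-based poisoning, NOT Nightshade itself:
    # shift an image's pixels within an imperceptible budget (epsilon) so that
    # a feature extractor maps it close to the features of a different concept.
    rng = np.random.default_rng(0)

    H, W_, C = 32, 32, 3
    n_features = 16

    W = rng.normal(size=(H * W_ * C, n_features))   # stand-in feature extractor
    img = rng.uniform(0.0, 1.0, size=(H, W_, C))    # placeholder "dog" image
    target_feat = rng.normal(size=n_features)       # features of a "cat" image

    epsilon = 8.0 / 255.0    # per-pixel budget: visually negligible
    step = 1.0 / 255.0
    x = img.flatten()
    x0 = x.copy()

    for _ in range(100):
        # Loss: squared distance between current and target concept features
        diff = x @ W - target_feat
        grad = 2.0 * (W @ diff)           # gradient of ||xW - t||^2 w.r.t. x
        x = x - step * np.sign(grad)      # signed gradient step (PGD style)
        x = np.clip(x, x0 - epsilon, x0 + epsilon)  # stay within the budget
        x = np.clip(x, 0.0, 1.0)          # stay a valid image

    poisoned = x.reshape(H, W_, C)
    print("max pixel change:", np.abs(poisoned - img).max())  # <= epsilon

In a real attack, the feature extractor would be a trained vision model and the target features would come from an image of another concept, so that, for example, pictures of dogs teach the model something about cats. That is the kind of misinterpretation effect the Chicago team describes.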

Nightshade is currently undergoing peer review, and if it passes this stage, the technology will be integrated into Glaze, an earlier development created by the same team, led by University of Chicago professor Ben Zhao. Glaze uses a similar technique, making adjustments to image files that are imperceptible to the naked eye but that substantially change what a model perceives when it is trained on them.

The success of Nightshade and Glaze will, of course, depend on their adoption by a critical mass of creators, because a significant number of "poisoned" files is required for the technique to have a meaningful impact on a model's performance.

More information

Source: Muy Computer
