
Nightshade poisons works of art in the fight against AI

January 29, 2024

Artists’ works are being deliberately altered online to make them unusable for AI models. The goal is not to attack AI, but to alert the tech giants to artists’ copyrights.

AI models often use copyrighted material without paying for licenses. A new project called Nightshade aims to protect artists from this: the technique “poisons” works of art, making them unusable for AI models. The goal is to convince tech giants to pay for licenses on copyrighted works. The researchers are also exploring how Nightshade can work together with an existing similar tool, Glaze, to give artists optimal protection.

Copyright and AI

Copyright and AI models do not yet go hand in hand. Many artists want to prevent their works from being used by AI models, but that is not easy. For many artists, staying off the internet entirely is not an option, so new ways are being developed to protect their works from AI online.

In this context, the University of Chicago launched the Nightshade project, a new technique that poisons artists’ works, making them unusable for AI models. According to project leader Ben Zhao, the goal is not to disable AI; rather, the researchers want to push the tech giants to license the work they train on. The project also shows that these AI models are vulnerable and that there are indeed ways to attack them.

“Poisoned” works of art

Nightshade subtly changes pixels in images so that AI models misinterpret them. The models incorrectly categorize the edited parts of these images and thus train themselves on “poisoned” data, which leads to incorrect interpretations and incorrect results for prompts. One example is the Mona Lisa, where some pixels were changed so that an AI reads the image as a cat’s head; to a human, the difference is not visible. (To clarify: the image at the top of this article is for illustrative purposes and is not an example of Nightshade.)
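To give a rough sense of what “changing pixels” means in practice, the sketch below adds a small, bounded perturbation to an image using NumPy and Pillow. It is only a toy illustration under those assumptions: the real Nightshade computes targeted perturbations that are optimized so a generative model learns the wrong concept, which is far more involved than the random noise used here. The file names are hypothetical.

```python
# Toy illustration of image "poisoning": add a pixel perturbation small
# enough to be hard for a human to notice, but present in the data an AI
# model would train on. This is NOT the Nightshade algorithm; Nightshade
# uses targeted, optimized perturbations rather than random noise.
import numpy as np
from PIL import Image

def perturb_image(in_path: str, out_path: str, epsilon: int = 4) -> None:
    """Add a small, bounded random offset to every pixel of an RGB image."""
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.int16)

    # Random offsets in [-epsilon, +epsilon]; small values stay visually subtle.
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)

    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(out_path)

# Hypothetical usage: write a perturbed copy next to the original.
perturb_image("artwork.png", "artwork_protected.png")
```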

Zhao previously worked on a similar project, Glaze, a cloaking tool that distorts how AI models “see” and interpret artistic style, so that artists’ original works cannot be imitated. The ideal formula for optimally protecting works of art from AI models would be to combine Nightshade with Glaze. Zhao’s team is currently testing how both techniques can be combined into a single tool.

Source: IT Daily
