Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data

Salamendacious@lemmy.world to News@lemmy.world – 510 points –
venturebeat.com

I understand where you are coming from, but most AI models are trained without the consent of those whose work is being used. Same with GitHub Copilot: its training violated the terms of various software licenses.

Then the response to that is laws, not vigilantism.

I agree, but those laws need to be enforced, and no one is doing it.