Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data

Salamendacious@lemmy.world to News@lemmy.world – 510 points –
venturebeat.com


The law would be the right response there.

Especially since malicious actors can very easily abuse the fuck out of this.

If you think there won't be a post right on fucking lemmy itself about infecting images then posting them on free repos because "lol fuck ai" then you're just not looking around, dude

I understand where you are coming from, but most AI models are trained without the consent of those whose work is being used. Same with GitHub Copilot: its training violated the licensing terms of various software licenses.

Then the response to that is laws, not vigilantism.

I agree, but those laws need to be enforced, and no one is doing it.