Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data

Salamendacious@lemmy.world to News@lemmy.world – 510 points –
venturebeat.com

Anyone who damages an AI model should be liable for the entire cost to purchase and train said model. You can't just destroy someone's property because you don't like how they use it.

So artists can't make certain art because some company's AI might get confused. Right then.

... If an artist doesn't want their art used, we already have a system in place for that. If that system needs expanding or changing, then that is the discussion that should be had.

Laws are better than random acts of destruction.

They should just make it better, you know?

Hey guys, I've been dumpster diving and got food poisoning. Can I sue the business?

Maybe they should’ve thought about that before they integrated people’s content without consent????

The law would be the right response there.

Especially since malicious actors can very easily abuse the fuck out of this.

If you think there won't be a post right on fucking Lemmy itself about infecting images and then posting them on free repos because "lol fuck ai", then you're just not looking around, dude.

I understand where you are coming from, but most AI models are trained without the consent of those whose work is being used. Same with GitHub Copilot: its training violated the terms of various software licenses.

Then the response to that is laws, not vigilantism.

I agree, but those laws need to be enforced, and no one is doing that.