Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data

Salamendacious@lemmy.world to News@lemmy.world – 510 points –
venturebeat.com

Please quote me the part where I said or implied I hate libraries.

You've been defending people's right to burn them down all thread.

I have, at no point, implied legitimate businesses/organizations should be burned down.

No, you implied that other structures of communal data storage should be burned down; but for some reason you disingenuously dissociate those from the others because of some self-perceived definition of what counts as legitimate knowledge.

AI is a librarian, datasets are the library. You want to set fire to the stacks, fuck everyone else's hard work.

If AI were a librarian, I'd be able to go there and find the works of whatever artists it kept copies of. Last I checked, that is not the case. Are you sure you know how these machine learning algorithms work?

If the AI is set on fire, all of the artists still keep their individual works of art on whatever websites they were on before they got scraped.

If you can't find their individual works, are they really there?

I can read and absorb Nietzsche, yet you won't find his books in my head.