Google says AI systems should be able to mine publishers’ work unless companies opt out, turning copyright law on its head

0x815@feddit.de to Technology@beehaw.org – 371 points –
theguardian.com

In its submission to the Australian government’s review of the regulatory framework around AI, Google said that copyright law should be altered to allow for generative AI systems to scrape the internet.



Personally, I'd rather stop posting creative endeavours entirely than simply let them be stolen and regurgitated by every single company that's built a thing on the internet.

I just take comfort in the fact that my art will never be good enough for a generative AI to steal.

If it's on any major platform, these companies will probably still use it. If they were allowed to scrape the whole internet, I doubt they'd have any human looking over the art they ingest.

It'll just be thrown in with everything else similar to how I always seem to find paper towels in the dryer after doing laundry.

Then I take comfort in the fact it might serve to sabotage whatever it generates.

"Bad" art is still useful in training these models because it can be illustrative of what not to do. When prompting image generators it's common to include "negative prompts" along with your regular one, telling the AI what sorts of things it should avoid putting in the output image. If I stuck "by Roundcat" into the negative prompts it would try to do things other than the things you did.

I think the topic is more complex than that.

Otherwise you could say you'd rather stop posting creative endeavours entirely than simply let them be stolen and regurgitated by every single artist who uses the internet for references and inspiration.

And the argument "but companies do so for profit" doesn't fully hold either, because many artists do the same: designers, illustrators, and others whose commissions may well draw ideas from your work.


Voluntary obscurity is always an option, I suppose.

We need to actively start sabotaging the data sources these LLMs are based on. Make AI worthless.

Your comment right here provides useful training data for LLMs that might use Fediverse data as part of their training set. How would you propose "sabotaging" it?
