Amazon to require some authors to disclose the use of AI material

MicroWave@lemmy.world to News@lemmy.world – 196 points –
apnews.com

After months of complaints from the Authors Guild and other groups, Amazon.com has started requiring writers who want to sell books through its e-book program to tell the company in advance that their work includes artificial intelligence material.

The Authors Guild praised the new regulations, which were posted Wednesday, as a “welcome first step” toward deterring the proliferation of computer-generated books on the online retailer’s site. Many writers feared computer-generated books could crowd out traditional works and would be unfair to consumers who didn’t know they were buying AI content.

In a statement posted on its website, the Guild expressed gratitude toward “the Amazon team for taking our concerns into account and enacting this important step toward ensuring transparency and accountability for AI-generated content.”


Considering what a total wasteland Amazon's self-published section is, I don't know that it could get much worse.

Of course any author with an IQ over 70 would have the good sense to never disclose they were using AI.

What's worse is AI-generated books on mushrooms, etc., which can be literally deadly (and yes, such books have already been published!)

Darwin in action. Anyone who'd use a guidebook to figure out which mushrooms to eat is gonna have a bad time regardless; it's not really something you can sum up safely in a book.

My mother was originally going to self-publish her first novel on Amazon, but she realized what a scam it was. I'm glad she found a real publisher. She's on her sixth book now. They aren't bestsellers or anything, but people actually buy and read them. eBooks and physical copies.

This is absolutely a good move, though I don't know how effective it'll be on its own. Unfettered AI garbage "content" is soon going to flood every storefront and service around, and the only way to really solve it is to close things down and move to more highly curated platforms. I wish that weren't the case, but I can imagine a future where it's hard to find anything worthwhile in a sea of AI-generated junk.

some?

Hopefully the mushroom foraging guides. Amazon is going to have a pretty big lawsuit if AI hallucinations start mixing up Death Caps with choice edible mushrooms and get someone killed.

Good. Unless you're using an AI you yourself cultivated on your own creations, you're plagiarizing with extra steps.

Isn't everything just plagiarizing with extra steps?

Taking inspiration from something is different from creating a Frankenstein's monster. AIs replicate; they do not create.

That's not actually how generative AI works. LLMs don't copy/paste material unless deliberately instructed to. And even then, most are built so they still won't reproduce their training material word-for-word.
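To make the point concrete, here's a toy sketch of the sampling idea described above: generation picks the next word from a probability distribution conditioned on context, rather than pasting any stored sentence. The vocabulary and probabilities here are made up for illustration and bear no resemblance to a real model's scale.

```python
import random

# Toy bigram "model": for each context word, a made-up probability
# distribution over possible next words (illustrative only).
next_word_probs = {
    "the": {"cat": 0.5, "dog": 0.3, "mushroom": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"barked": 0.7, "ran": 0.3},
    "mushroom": {"grew": 1.0},
}

def generate(start, max_words=5, seed=None):
    """Sample one word at a time from the distribution --
    no stored sentence is ever copied verbatim."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(max_words):
        dist = next_word_probs.get(words[-1])
        if not dist:  # no continuations known for this word
            break
        choices, weights = zip(*dist.items())
        words.append(rng.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the", seed=0))
```

Real LLMs do the same thing with billions of parameters and subword tokens instead of a lookup table, which is why verbatim reproduction is the exception (usually from heavily duplicated training data) rather than the mechanism.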

Yes, change a few words here and there: it totally isn't plagiarism!

I'm not arguing this with people who have likely never created anything other than code.

Again, not how LLMs work. Maybe before you decide who you will and won't argue with, you should decide whether you should even be arguing about something you don't understand in the first place.

You both owe Rick and Morty royalties

You joke, but you bring up an excellent point as to why I dislike the "AI is plagiarism" argument that I see a lot these days.

Everything is plagiarized in some way. No thought is truly original. Unless you spend your whole life with zero contact to anyone or anything and consume zero media of any form (in which case, have fun conveying your original thoughts with the language you've had to invent for yourself that nobody else could possibly translate), then every idea is based on another idea before it. Every single thought has an inspiration behind it. LLMs aren't just copy/pasting content; the actual logic behind generative content production is incredibly similar to how people form thoughts and ideas of their own.

That said, if you're writing a book using AI, I'd argue it's a case of laziness more than plagiarism. Though I don't see an inherent problem in using AI to help write a story. But if the whole book is AI-generated, I can't imagine it being good enough to sell well enough to justify the time and effort of producing that amount of text and getting it published, so I wouldn't foresee it being a very widespread problem just yet.


Overly simplistic outlook.

If you provide the sources and direct the LLM to use those sources, and then proofread the damn thing and cite the sources, it flat out is not plagiarism.

It's as much plagiarism as using a calculator is to find square roots.
