Striking actor Stephen Fry says his voice was stolen from the Harry Potter audiobooks and replicated by AI

stopthatgirl7@kbin.social to Technology@lemmy.world – 800 points –
fortune.com

The actor told an audience in London that AI was a “burning issue” for actors.

I think it's important to remember how this used to happen.

AT&T paid voice actors to record phoneme groups in the '90s and 2000s and has been using those recordings to train voice models for decades now. There are about a dozen AT&T voices we're all super familiar with because they're on all those IVR/PBX replacement systems we talk to instead of humans now.

The AT&T voice actors were paid for their time and, while they weren't offered royalties, they were told that their voices would be used to generate synthetic computer voices.

This was a consensual exchange of work. Not super great long term, since there were no royalties and it was really just a "work for hire" that turned into a product, but that aside, the people involved all agreed to what they were doing and what their work would be used for.
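For anyone curious what that phoneme-bank approach looks like mechanically, here's a minimal sketch of concatenative synthesis in Python. The clip file names and the phoneme sequence are hypothetical placeholders, and real systems also smooth the joins and adjust pitch and duration:

```python
# Minimal sketch of concatenative synthesis: pre-recorded phoneme/diphone
# clips are stitched together to form an utterance. File names below are
# hypothetical; all clips are assumed to share the same sample rate/format.
import wave

def concatenate_phonemes(clip_paths, out_path):
    """Join a sequence of same-format WAV clips into one output file."""
    with wave.open(out_path, "wb") as out:
        params_copied = False
        for path in clip_paths:
            with wave.open(path, "rb") as clip:
                if not params_copied:
                    out.setparams(clip.getparams())  # channels, width, rate
                    params_copied = True
                out.writeframes(clip.readframes(clip.getnframes()))

# e.g. a hypothetical rendering of "hello" from recorded phoneme clips:
concatenate_phonemes(["hh.wav", "eh.wav", "l.wav", "ow.wav"], "hello.wav")
```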

The problem at the root of all the generative tools is ultimately one of consent. We don't permit the arbitrary copying of things that are perceived to be owned by people, nor do we think it's appropriate to use someone's image, likeness, voice, or written works without their consent.

Artists tell politicians to stop using their music all the time, etc. But until we get a real ruling on what constitutes a "derivative" work, nothing will happen. An AI is effectively a derivative work of all the content that makes up the vectors that represent it, so it seems like a no-brainer, but because it's "radio on the internet" we're not supposed to be mad at Napster for building its whole business on breaking the law.

I don't think permission and consent alone can be relied on in a labor relationship, because of the unbalanced position of power employees and employers have with each other. Could the workers really negotiate better working conditions? They really can't, not without a union anyway.

I think a more interesting (and less dubious) example of this would be Vocaloid and, to a greater extent, CeVIO AI.

Vocaloid is a synth bank where, instead of the notes being musical instruments, they're phonemes which have been recorded and then packaged into a product you pay for, which means royalties are involved (I think there might also be a thing with royalties for big performances and whatnot?). CeVIO AI takes this a step further by using AI to better smooth together the phonemes and make pitching sound more natural (or not: it's an instrument, and you can break it in interesting ways if you try hard enough). And obviously, the voice providers consented to that specific thing and get paid for it. They gave Yamaha/Sony/the general public a specific character voice and permission to use that specific voice.

(There are FOSS voicebanks too, but that adds a different layer of complication, since I think a lot of them were recorded before the idea of an "AI bank" was even a possibility. And while a paid voicebank is a proprietary thing, the open-source alternatives are literally just a big folder of .WAV files, so it's much easier to take them outside their intended purposes.)

Studios basically want to own the personas of their actors so they can decouple the actual human from them and just use their images. There have already been a lot of weird issues with this in video games with body capture and voice acting: contracts aren't read through properly, or the wording is vague, and not all agents know about this stuff yet. It's very dystopian to think your whole appearance and persona can be taken from you and commodified. I remember when Tupac's hologram performed at Coachella in 2012 and thinking how fucked up that was. You have these huge studios and event promoters appropriating his image to make money, and an audience effectively watching a performance of technological necromancy where a dead person is re-animated.

Did Tupac's estate agree? Or receive compensation?

Who cares if his estate agreed to it? HE didn't. His estate shouldn't have the right to make money off of things he never actually did.

Let the dead stay dead, it's just an excuse to not pay new, living artists.

That's literally how estates work.

Once I'm 6 feet under, if it could give my family a better life I'd say they should be able to agree to whatever they want on my behalf as long as it doesn't go against my will.

I know that legally they have the right. I'm saying they shouldn't have that right because reanimating a digital facsimile of your corpse just to puppet it to make money is fucked up. This includes shit like the CG Tarkin and Leia in Star Wars as well as the Tupac hologram

I'm saying it's not fucked up and I would be totally down for my family to do it

It's just the beginning, for sure. This future will be the end of artists, and still everyone will be clapping for AI productions like fools.

No one cared when spreadsheets replaced a huge chunk of office workers.

If the results are the same, what's the issue?

Artists feel special because, until recently, computers couldn't automate their work. But it's the same as any job.

The people who lost those jobs cared.

If not for the wages, people hardly have any attachment to most office jobs. But when it comes to artistic endeavors, a lot of people dream of being able to make a career in those fields. Frankly, that sort of comment itself seems like it comes from envy, like artists ought to be taken down a peg for daring to work on something they are passionate about. I can't think of a single artist who has bragged about being above automation.

As someone who works in an office job, if AI could free me to work on something creative, that would be wonderful, but if it will instead replace already existing creatives and leave us both without anywhere to work, that is not really helping anybody but the executives profiting from it. What benefit does that even add to my life? Remixed porn? Meme generators? It's not the same level of benefit as industrial automation, if any. The human element of art enriches it in a unique way that AI trying to distill a style from countless samples won't be able to do.

This hits the nail on the head. A major component of art is that it's an outlet of human creativity, something we find fulfilling to both produce and consume. If creativity is delegated to machines, what's left for us humans? At some point, we'll grow tired of Taco Bell and re-runs, and what then?

Making art is something people enjoy, for one thing. Good art also has something of the artist in it, something to it other than "it was made from this prompt".

Art is just combining previously learnt techniques together with a specific subject. Since AI essentially knows all the techniques it could be better eventually.

Nothing is stopping people making art for fun.

Art is just combining previously learnt techniques together with a specific subject.

If that's what you look for in art then sure, but I disagree with that definition. A child's drawing of her dad has aspects to it that a picture of that dad taken in a photo booth can never have. A poem about war is much more meaningful when it comes from a refugee. The Wikipedia page for art lists several 'purposes' and most of them are not something AI art can ever fulfil.

You can't say ever. It could learn from every war diary and report ever written and write amazing stuff. It's just a matter of time. It's currently limited quite significantly by computing power.

It could do that, but its writing would be hollow because those stories are meaningful due to the lived experiences behind them. For example anyone who's read The Diary of a Young Girl could write something similar in Anne Frank's style, but it wouldn't be nearly as impactful because learning about an event is very different from living through it.

It's limited by the trends of human art. The art and text AIs that we have are based on pattern processing. They output what is expected based on what we feed them. They aren't able to come up with entirely new styles or philosophies; they don't even have the cognitive ability to hold any philosophy. An AI describing a tree or depicting an image of a tree doesn't have an understanding of what a tree is. It is not aware of the world; it can only replicate human words and images.

A breakthrough needs to happen for them to be capable of anything more, but that's going to be its own can of worms.
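As a toy illustration of that pattern-processing point, here's a minimal sketch (with a made-up corpus) of a bigram text generator. It can only ever recombine word pairs it has already seen, which is a crude stand-in for "outputting what is expected based on what we feed it":

```python
# Toy pattern processor: a bigram model over a tiny, made-up corpus.
# It can only emit word pairs that already occur in its training text.
import random

corpus = ("the old oak tree stands by the river and "
          "the river runs past the old tree").split()

# Record which words have been observed to follow each word.
follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, []).append(b)

def generate(start, length=8):
    word, out = start, [start]
    for _ in range(length):
        word = random.choice(follows.get(word, [start]))  # fall back on dead ends
        out.append(word)
    return " ".join(out)

print(generate("the"))  # e.g. "the river runs past the old oak tree stands"
```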

It's not the same as any job. It's putting your face and your words behind something you cannot consent to. If someone spoofed your username and started posting offensive things, I've no doubt you would be upset. That's just your username. Now add your real-life photo, your face, and your voice.

You would have to be a sociopath not to care if suddenly your friends and family received a video of you performing offensive acts or shilling for a political cause you are vehemently opposed to.

That's literally not how they work. They figure out a mathematical formula for generating things and apply it. Your analogy doesn't make any sense.

They aren't copying anything in reality. No more than the way an artist's brain changes when looking at other art.

In fact that is a much better analogy for how they work as they are modelled on our neurons.

AI will annihilate most data entry workers in the next few years as well.

AI is very stupid and breaks in ways you wouldn't expect

Some AIs are more intelligent than the average person.

Ask a normal person to do the tasks ChatGPT can and I bet the results would be even worse.

Ask ChatGPT to do things a normal person can, and it also fails. ChatGPT is a tool, a particularly dangerous Swiss Army chainsaw.

I use it all the time at work.

Getting it to summarize articles is a really useful way to use it.

It's also great at explaining concepts.

It's also great at explaining concepts.

Is it? Or is it just great at making you think that? I've seen many ChatGPT outputs "explaining" something I'm knowledgeable of and it being deliriously wrong.

I agree. I have very specialized knowledge in certain areas, and when I've tried to use ChatGPT to supplement my work, it often misses key points or gets them completely wrong. If it can't process the information, it will err on the side of creating an answer, whether it is correct or not and whether it is real or not. The creators call this "hallucination."

Yeah it is if you prompt it correctly.

I basically use it instead of reading the docs when learning new programming languages and frameworks.

That's great, but it works until it doesn't, and you won't know when unless you're already knowledgeable from a real source.

You know it doesn't work when you try it, and if you tell it that something doesn't work, it'll usually correct itself.

A coworker tried to use it with a well-established Python library and it responded with a solution involving a Class that did not exist.

LLMs can be useful tools, but be careful about trusting them too much: they are great at what I'd best describe as "bullshitting". It's not even "trust but verify"; it's more "be skeptical of anything it says". I'd encourage you to actually read the docs, especially those for libraries, as it will give you a deeper understanding of what's actually happening and make debugging and innovating easier.
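One cheap sanity check that helps with exactly the made-up-class problem mentioned above: before trusting an LLM's suggestion, confirm the name actually exists in the installed library. A minimal sketch, where the module and attribute names are just placeholders for whatever the model proposed:

```python
# Quick check that an LLM-suggested class/function actually exists in a module.
# Module and attribute names here are placeholders; substitute the real ones.
import importlib

def suggestion_exists(module_name: str, attr_name: str) -> bool:
    """Return True if `attr_name` is a real attribute of `module_name`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, attr_name)

print(suggestion_exists("collections", "OrderedDict"))  # True
print(suggestion_exists("collections", "MagicalDict"))  # False: hallucinated name
```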

I've had no problem using them. The more specific you get, the more likely they are to do that. You just have to learn how to use them.

I use them daily for refactoring and things like that without issue.

"it wasn't me planning the terrorist attack over the phone, it was someone stealing my voice with an AI"

This is, unfortunately, the world we are about to be in.

I guess voice recordings will have as much value as text messages

His voice wasn't stolen, it's still right where he left it.

Fair enough. It's not theft, it's something else.

But that's just semantics, though.

The point is that his voice is being used without his permission, and that companies, profiteering people, and scammers will do so using his voice and the voices of others. He likely wants some kind of law against this kind of stuff.

It's emotionally charged semantics.

It's emotionally charging to hear your own voice saying things you did not say. Dismissing a victim describing what happened because they're emotional about how they were wronged doesn't make sense to me.

How is this different from a human doing an impersonation?

Because it can be done fast, reliably and at scale.

This, and it's not a human. All these analogies trying to liken a learning algorithm to a learning human are not correct. An LLM is not a human.

Our entire society would collapse if we couldn't do things fast, reliably, and at scale.

Yes, but if “things” is replaced by scamming artists, that’s a shitty society

Artists aren't being scammed. They're being replaced by automated systems. It's the same thing that happened to weavers and glassblowers. The issue isn't that their job is being automated. It's that people replaced by automation aren't compensated. Blame the game, not the players.

I don't think it's a particularly onerous mental challenge to understand that we're not upset about the general concept of doing things at scale, and that it depends on what the thing in question is.

For instance, you'd probably not be terribly upset about me randomly approaching you on the street once - mildly annoyed at most. You'd probably be much more upset if I followed you around 24/7 every time you entered a public space and kept badgering you.

You know what the difference is; trying to act otherwise is just being obtuse.

There is a difference between complete duplication and impersonation for the purposes of satire.

If you made a painting for me, and then I started making copies of it without your permission and selling them off, while I might not have stolen the physical painting, I have stolen your art.

Just because they didn't rip his larynx out of his throat, doesn't mean you can't steal someone's voice.

We're getting into semantics, but it's counterfeiting, not stealing.

It would be more like if you made a painting for me, and I then used it to replicate your artistic style, made new paintings without your permission, and passed them off as your work.

Well, I just printed a picture of the Mona Lisa.

Did I steal the Mona Lisa? Or did I just copy it? Reproduce it?

You’re also not causing da Vinci to potentially miss out on jobs by copying it. You’re also not taking away his ability to say no to something he doesn’t want to be associated with.

That's fine. I'm not arguing this is a bad thing, I'm just being pedantic about the word theft.

Having your voice used to say things you didn't say is a terrifying prospect. Combined with deepfakes, it goes one step further.

But is it technically theft?

Yes, actually. In the same way as copyright infringement or identity theft could be considered so.

Bette Midler vs Ford

Wow the court obviously got this one wrong. Imitation is in no way stealing someone's voice.

Not to mention with billions of people walking around is anyone's voice really unique? I have met hundreds of people in my life who sound so much alike it is hard to distinguish them.

Your link didn't say anything about theft...

Jfc the pedantry.

I think it was a joke

No, the use of words matters when having a debate. "Theft" is an emotionally charged word with a lot of implications that don't actually map well to what's going on here. It's not a good word to use for this.

Seems to map pretty well. I've looked up a handful of definitions of theft, and looking at it from an emotionless perspective it seems to fit: to take something without permission or the right to do so. I don't really see where the removal of a finite resource is required.

That's why I figured that comment was just a dad joke.

When you steal something the person you stole it from doesn't have it any more. That's why copyright violation is covered by an entirely different set of laws from theft.

This isn't even copying, really, since the end result is not the same as anything in the source material.

Lots of people may want it to be illegal, may want to call it theft, but that won't make it so when they take it to court.

“When you steal something the person you stole it from doesn't have it any more.”

Idk, “identity theft” is a crime, but you don't actually remove the person's identity from them either. And also digital reproduction like in the case of piracy doesn’t remove a copy from the author but that is also illegal and is also considered theft. So I'm not really sure where you're getting this idea that something isn't both considered theft and a crime if it doesn't remove a copy from the original owner; there are multiple examples to the contrary.

The point is loss. You have to show you were damaged. In this case, Fry isn't losing anything.

He’s losing work and the effectiveness of his strike. Either they want his voice and they’d pay for it if he wasn’t striking, in which case his literal voice is working against his figurative one against his will, or they just need a voice and there was no fucking reason to steal a real person’s.

You have to prove that in court. And no, he's not losing work, because there's no one paying in the first place. ChatGPT didn't get a job over him. No one said, "Oh, we don't need him to do this voice-over, we have AI."

Also, remember we are not talking about replicating his voice; we are talking about training an AI with it. Technically different subjects.

He has to prove it in court if he wants accurate compensation, but that’s not really on the table atm.

Did you read the article? I’ll quote the relevant section.

During his speech at CogX Festival on Thursday, Fry played a clip to the audience of an AI system mimicking his voice to narrate a historical documentary.

“I said not one word of that—it was a machine. Yes, it shocked me,” he said. “They used my reading of the seven volumes of the Harry Potter books, and from that dataset an AI of my voice was created, and it made that new narration.”

They are replicating his voice.

And also digital reproduction like in the case of piracy doesn’t remove a copy from the author but that is also illegal and is also considered theft.

No, it is considered copyright violation. That's a crime too (well, often a civil tort) but it is not theft. It's a different crime.

If you want something to be illegal there needs to be an actual law making it illegal. There isn't one in the case of AI training because it isn't theft and it isn't copyright violation. This is a new thing and new things are not illegal by default.

Calling it "theft" is simply incorrect, and meaningfully so since it's an emotionally charged and prejudicial term.

You skipped the identity theft part because I guess it kinda takes all the wind out of your argument lol.

Even then, “Theft” isn’t a single unique crime or law that’s distinct from copyright infringement, it’s an umbrella term. What you’re thinking of as the crime of “theft” is “larceny”, which actually does refer to taking physical property specifically. But Stephen Fry didn’t use the term Larceny here.

Copyright infringement when dealing with the theft of intellectual property is a type of theft. And since the rights to your voice and or performance is a thing you can own, it can easily be considered theft. It doesn’t need a new law, it’s just a new way to commit an old crime.

You skipped the identity theft part because I guess it kinda takes all the wind out of your argument lol.

I skipped it because it's not related to what's going on here. "Identity theft" is fraud, not just impersonation. People impersonate other people with no problem, eg this Dolly Parton impersonation contest that was the first hit when I went googling for "look-alike contest". You could perhaps use AI voice emulation as part of an identity theft scheme, but the crime is in how it's used not in the emulation itself.

Copyright infringement when dealing with the theft of intellectual property is a type of theft.

No, it is emphatically not a type of theft. That's the fundamental point you keep missing here.

Judges have explicitly and specifically said that this is not the case. In Dowling v. United States the U.S. Supreme Court ruled that copyright infringement was not stealing. This is a legal matter, which is not subject to personal opinion - it's not theft. Full stop.

The fact remains that in the case of identity theft, it is not the case that the thing being stolen must be a singular finite thing that is removed from your possession, which directly contradicts your original statement, which your entire argument depends on. You claim that it isn’t theft because his voice is “still where he left it”. Well in the crime of identity theft your identity remains right where you left it. This is the point you keep missing.

As for the Dowling v. United States ruling, it’s not the case that the judge held that copyright infringement isn’t theft, you’ve misinterpreted it entirely. What was held was that “Copies of copyrighted works cannot be regarded as "stolen property" for the purposes of a prosecution under the National Stolen Property Act of 1934.”

That is a very narrow ruling that clarified the definition of stolen property only as it applies to potential prosecution over law unrelated to copyright infringement. Like I said, there are different types of theft, and this ruling simply solidified the difference between crimes of the nature of theft, and larceny.

So, do you have a ruling somewhere that states that copies of copyrighted works can be regarded as "stolen property" for some other purpose?

Why are there completely separate laws regarding theft of physical property and the violation of copyrights if they can be regarded as the same?

What word or phrase would you have used in the headline?

Copyright infringement, which, in this context, is still a seriously concerning crime.

It's not copyright infringement. You can't copyright a style, which is basically what a voice amounts to.

This is something new. It's a way of taking something that we always thought of as belonging to a person, and using it without their permission.

At the moment the closest thing is trademark infringement, assuming you could trademark your personal identity (which you can't). The harms are basically the same: deliberately passing off something cheap or dodgy as if it was associated with a particular entity. It doesn't matter if the entity is Stephen Fry or Pepsi Max.

It is, as a matter of fact. When Fry recorded his voice for those audiobooks, they were copyrighted. Reproducing the contents of those works as they have is, arguably, a violation of copyright.

And when you compare Stephen Fry to Pepsi Max, that's a false equivalence, because you're comparing copyrighted material to a trademarked brand, which are two different things.

Still, to your point of theft, nobody is taking anything from anyone. They are using something without permission, and that still falls squarely as copyright infringement, not theft.

Reproducing the contents of those works as they have is

This did not occur.

This is the best summary I could come up with:


Among those warning about the technology’s potential to cause harm is British actor and author Stephen Fry, who told an audience at the CogX Festival in London on Thursday about his personal experience of having his identity digitally cloned without his permission.

Speaking at a news conference as the strike was announced, union president Fran Drescher said AI “poses an existential threat” to creative industries, and said actors needed protection from having “their identity and talent exploited without consent and pay.”

As AI technology has advanced, doctored footage of celebrities and world leaders—known as deepfakes—has been circulating with increasing frequency, prompting warnings from experts about artificial intelligence risks.

At a U.K. rally held in support of the SAG-AFTRA strike over the summer, Emmy-winning Succession star Brian Cox shared an anecdote about a friend in the industry who had been told “in no uncertain terms” that a studio would keep his image and do what they liked with it.

Oscar winner Matthew McConaughey told Salesforce CEO Marc Benioff during a panel event at this year’s Dreamforce conference that he had concerns about the rise of AI in Hollywood.

A spokesperson for the Alliance of Motion Picture and Television Producers (AMPTP), the entertainment industry’s official collective bargaining representative, was not available for comment when contacted by Fortune.


The original article contains 911 words, the summary contains 213 words. Saved 77%. I'm a bot and I'm open source!

Paywall.

And that archive is blocked at my work because it's in Russia.

Apologies. Do you know a better way I can link stuff behind paywalls?

Also, am I being a jerk when I link to Archive? I don't wanna spread shady links unknowingly.

See, I'm pulling the smartest move right now: AI can't take your job if you use AI to take your own job first.

Besides, I think Hollywood is pretty behind on tech overall. The current state-of-the-art voice generators are still pretty bad, and it'll be a very long time before they can replace actors in quality (if ever): if you train the AI voice on audiobooks, the generated voice is going to sound like someone narrating an audiobook, which really doesn't sound natural for dialogue at all.

I think the key point, then, isn't to ban generative transformer-based AI: once the tech is out of its box, you can't exactly put it back in again. (heh) The real question to ask is: who should own this technology so that it does good and helps people in the world, instead of being used to take away people's livelihoods?

Wrong. The real question is why we presuppose that the output of creatively driven individuals must generate profit for a capitalist economy before those people are permitted the basic necessities of life. Frankly, I suspect most of our most valuable contributors to culture are never given the opportunity to be bad enough, for long enough, to develop into their potential.

This whole "oh no, AI is going to take away our livelihoods" panic fundamentally accepts the false notion that people only deserve a functional life so long as the primary activities of that life contribute towards increasing the wealth of a tiny percentage of individuals.

It's the same mistake that leads us to massively undersupport educators and carers and will have people freaking out about how they'll "earn a living" once robots are able to do everything we practically require to be done.

People are fundamentally entitled to a living. If someone is being denied one, then look at the system that causes that not the specifics of that particular flavour of how it's happening.

This is from a guy who advocates for Linux because it is open source! The only violation here would be if someone used that voice while claiming it to be Fry. That would be fraud. Otherwise there is no issue.

Since it is paywalled I can only guess from the title.

I don't understand the problem. He was paid for reading books, and now we all have his voice. What did he expect?

Is there an AI imitating his voice and making money? Is it being presented under his name? If not, what would be the difference from some person imitating his voice? Would that be stealing too?

Basically, I don't see any problem with me buying those books, training a local model on them, and giving it other books to read. That can't be illegal, right?

Giving it to other people while mentioning his name would definitely be fraud. But stealing? I don't know.

Selling it to other people under another name... I don't see a problem.

But then we come to AI-generated images, and I do start thinking that way. Though if they can find someone who looks like him and another person who sounds like him... they're all good?

Agree. The AI hate is becoming absurd and irrational. People are really being played by the "think of the poor artist" propaganda.

I haven't seen him in anything for ages, is being on strike a euphemism?

Not a lot on his bio since 2020...

https://en.wikipedia.org/wiki/Stephen_Fry

He can’t do much as an actor without a voice.

Don't worry, "artists" only complain about AI when open-source AI gets released.

Get your head out of your ass. Their voices are their art and to replicate that is not only disturbing it’s morally wrong. Especially if you do so for profit.

It's only wrong when done for profit.

Otherwise you're just using their material as data for an algorithm and a personal use case.

I don't know what someone would use AI art for "personal use" aside from trying to make some sort of porn or something for themselves

Use the voices for a film project or machinima if you want, use the picture generation models to create wallpapers, it's not my fault you insist on being obtuse about this by pretending you can't figure out a use case that isn't based around making money.

A film project or a machinima that I won’t be posting online or sharing with anyone? AI art generally doesn’t look very good, so I wouldn’t want to stare at it all day only to notice the imperfections all over it. Idk about you but it seems like these models are designed specifically to avoid paying talented people for their work. Simple as that. If we didn’t have capitalism they would simply not exist

This website is all the same, just a bunch of Luddites mad that technology and advancement are killing jobs. Combines took farming jobs, QuickBooks took accounting jobs, AI will take data-entry and artists' jobs.

The ONLY way to get off capitalism is to automate the economy via robotics and machine learning models, it's the only way we could ever achieve a stateless society economically.

If we didn’t have capitalism they would simply not exist

Put down the crack pipe, any society looking to create a more socialist -> communist economy has developed and leaned on automation to do so. Nothing about that is going to change.

I'm not against the concept of automation. I'm against the concept of stealing the copyrighted works of artists who rely on this work to survive. AI does this, and it does it pretty poorly.

Nobody complained about copyright when Microsoft had the only image AI in the game; only when the open-source Stable Diffusion came out did they start screeching about how AI was "stealing their jobs".

Fuck off. The tech got popular and the public got educated on what makes it work.

So years of Microsoft advertising DALL-E did nothing to educate the public about how AI works, but they're suddenly all experts the week after Stable Diffusion comes out?

No, they didn't, because I'd literally never heard of it until your comment. And I understand that my experience is anecdotal, but I guarantee I'm not the only one, or even one of only a couple thousand. You severely overestimate how knowledgeable the general public is about AI. Most haven't even heard of ChatGPT, and that's in the news, let alone expecting everyone to be interested enough to actually educate themselves on it.

Like you’re the only person in this thread that’s even mentioned Microsoft’s version, yet you think “the public” knows about it?

Uh, no, people definitely did. Mostly the people who actually knew how this shit worked. But even laypeople complained when it was just DALL-E and Midjourney.

What are you talking about? When MS had the only image AI in the game, it was garbage and couldn't do anything useful. Of course no one was threatened.

But after researchers got their hands on Nvidia 3000-series cards and finally had access to the hardware, more advanced research papers started spilling out, which has caused this crazy leap in AI tech.

Now the image/audio AI is advanced enough to be useful, hence the threats.

When MS had the only image AI in the game, it was garbage and couldn't do anything useful.

And yet it was still doing exactly the same thing that people are now going on about how "unethical" it is.

Just goes to show that they don't actually care how "unethical" it is until it actually poses a threat to their income. It's about money, not about principle.

This is such a ridiculous argument it’s not even funny. You have absolutely no evidence to back up your deranged claim. Take your victim complex somewhere else.

AI can very easily be abused and I don't see how this is related to the tech being open sourced or not. Fighting to ensure you aren't exploited is fine and I support anyone to fight against exploitation.

I don't get why so many people feel the need to defend big corporations this much. It's not like they're going to share the profits with the people who defend them, nor do they probably care.

If anything, the industry will just use whatever ammo they can to exploit more people.

Without maintaining and creating protections, they will roll back until there are almost none. Our current labor rights didn't come for free, they were fought for.

They downvoted him because he spoke the truth.

It's funny how all (or at least most) of the parents of those "artists" told them to do/learn something real, and now they're paying the price for their bad choice.

I've argued with someone about how pictures made by Stable Diffusion are not Art, while there are literally "paintings" where the "artist" just jizzed on the canvas and which then got declared Art. I trolled him by sending him multiple generated anime pictures and asking him which one was "Art", since he said he could recognize Art. He chose one and fell into the trap.

ChatGPT is why the public is scrambling about AI. AI art has been around a while, and there's always been complaining because it's lame compared to real artists. This has fuck all to do with it suddenly being open-source AI.