The double sexism of ChatGPT’s flirty “Her” voice

jeffw@lemmy.world to Technology@lemmy.world – 151 points –
vox.com

He tweeted "Her", which explicitly tells us it's a deliberate imitation of Scarlett's voice in that movie. And he tried to negotiate licensing her famous voice, which she rejected.

So it's more than just a coincidence, it's deliberate bad-faith behaviour. Legally you can't misrepresent a product as being from a famous person when it isn't, and he very much did that. I guess he was hoping she'd give in and accept the licensing agreement after the fact. But instead it looks like he's in deep legal water now.

That's not proof of anything. It's the best-known reference to an "AI girlfriend" that currently exists in popular culture.

This reminds me of when the Fine Brothers tried to trademark the word "react" or when Paris Hilton did the same thing for "That's hot".

Celebrities get wide latitude to protect themselves from imitators. Impressionists can do "satire" etc., but this isn't that. It's explicitly a reference to her voice in the movie, and as such she's protected by law from them going around her and hiring someone else to imitate her.

Well, maybe Scarlett needs to start paying royalties to millions of Midwestern women. Because she didn't come up with that way of talking on her own, now did she?

This argument is so stupid it's remarkably stupider than even the surrounding comments in a Lemmy thread full of braindead bot humpers.

Congrats! 🎈

Alternate theory: they heard what they came up with, tried to license the use of her voice to avoid a legal fight, hoped she might come around after the fact, and now here we are.

Seems possible anyway.

Do we know whether Sam had any specific prior interest in Scarlett?