Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

L4sBot@lemmy.world (mod) to Technology@lemmy.world – 285 points
Taylor Swift Is Living Every Woman’s AI Porn Nightmare
vice.com




You just keep shifting your argument to create some sort of sympathy, I guess. No one is saying a rich person isn't a victim. The point is that being a victim as a wealthy and influential woman like Taylor is a lot different from being a victim in a working-class context. If you disagree with that, then you're either being intellectually dishonest or living in a dream world.

Even the law agrees. It's a lot harder for a celebrity to win a defamation lawsuit than it is for a normal person; as a public figure, you typically have to show actual malice. Frankly, that's the legal standard that would probably apply to any lawsuit involving the deepfakes anyway.

The wealth of the victim doesn’t change the crime.

It's not a crime.

So, creating nude AI deepfakes isn't a crime? Then there are no victims at all. What's everyone talking about, then?

It can't be a crime unless there is a criminal statute that applies. See if you can find one that does.

So there are no victims, rich or poor. Why is this a problem?

Your response doesn't logically follow from my comment. It attempts to reframe the argument by setting up a strawman, and it shows that you either don't understand the difference between civil and criminal law in the United States or are choosing to ignore it because it doesn't support your reframed argument.