"Shame must change sides": a Belgian model warns about deepnudes

MicroWave@lemmy.world to News@lemmy.world – 182 points –
euronews.com

Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

"I'd already heard about deepfakes and deepnudes (...) but I wasn't really aware of it until it happened to me. It was something that happened in other people's lives; it wouldn't happen in mine", thought Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. Subject: "Realistic?". "We wonder which photo would best resemble you", she reads.

Attached were five photos of her.

In the original content, posted on her social networks, Julia poses dressed. In front of her eyes are the same photos. Only this time, Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.


Is this different to how people would edit Britney Spears' face onto porn stars in the 90s?

It's much easier to do now. You can generate several in a single minute, and the barrier to entry is far lower than with Photoshop. Legally, though, these seem indistinguishable.

They're easier to create and more realistic. The prevalence and magnitude of an immoral act affect how it should be legislated. Personally I don't care if people make these and keep them to themselves, but as soon as you spread it, I think it's immoral and harassment, and there should be laws to prevent it.

It's like that, except deep

Probably should have sued those people too... People need to cut this shit out. You're fucking with other people's lives.

They're not going to. There is an insane amount of entitlement around people's jerk off material. Right here on Lemmy, I've seen someone (who denied being a child) call pornography a "human right" and groups of people insisting they should be able to openly trade images of child rape as long as they're AI generated.

Fuck you people who equate pornography with child porn. You know what you're doing, you sick bastards.

Pornography is not at all the same thing as child porn. Do not speak about them in the same way.

That sounds like an insane amount of entitlement from the one guy you found. Hopefully that entitles you to ignore everyone with even a fraction more nuance.

How dare I ignore the many subtle layers of nuance in "Using AI to create pornographic images of a woman and then sending them to her so she knows you've done it".

and groups of people insisting they should be able to openly trade images of child rape as long as they’re AI generated.

"Be able to" in what sense? Morally and ethically? No, obviously not. But what would the legal reason be to make it illegal, since no actual children were involved? If I paint an explicit painting of a child being raped, is that illegal? I don't think it would be. It would certainly give people good reason to be suspicious of me, but would it be illegal? And would an AI-generated image really be different?

But what would the legal reason be to make it illegal since no actual children were involved

Prove it. Trawl through thousands and thousands of images and videos of child sexual assault and tell me which ones were AI generated and which were not. Prove the AI hadn't been set up to produce CSAM matching a real child's likeness. Prove it won't normalize and promote the sexual assault of real children. Prove it wasn't trained on images and videos of real children being raped.

Legalising AI-generated child pornography is functionally identical to legalising all child pornography.

Legalizing or already legal? Because that's my question. I don't think it would be illegal, at least not in the U.S. I can't speak for other countries, but here, proving a negative in court isn't a thing.

I think porn generation (image, audio and video) will eventually be very realistic and very easy to produce with just a few clicks and some well-crafted prompts. It would be a whole other level than what Photoshop used to be.

I've said it before: banning this doesn't work. Legislation will always play catch-up to ever-improving technology. The best we can do is flood the internet with AI porn. Of everyone, as much as possible. To the point nobody even cares any more, because there is nudity of everyone. Normalize it before it ruins lives.

or lean into it; ban clothing

Global warming hasn't made that an option around here yet. Give it 10 years.

You haven't seen some of us without clothes

not according to this AI generated image I just made of "everybody naked"

The other option, and the more likely one, is an extremely broad law that is intended to account for future technologies but will actually be used to further erode civil liberties.

Welcome to anarcho-naturism.

Whatever I search on Pinterest, Google, or Bing, the images nowadays are mostly AI generated. I'm so used to them by now that I just don't care anymore. If something looks cool to me, I praise it. Recently, hyper-realistic AI-generated videos have been popping up, and once there are enough datasets of free porn videos, which will most definitely happen within a few years, the porn industry will be filled with AI-generated porn videos as well.

I think AI generated porn videos are going to be very realistic because there's so much free porn.

This is going to be a serious issue in the future: either society changes and these things become accepted, or these kinds of generative AI models have to be banned. But even that's still no "security" against it...

I also think we have to come up with digital watermarks that are easy to use...

Honestly, I see it as kinda freeing. Now people don't have to worry about nudes leaking any more, since you can just say they're fake. Somebody starts sending around deepfakes of me? OK, whatever, weirdo, it's not real.

I'm guessing it's easier to feel that way if your name is Justin.

If it was Justine, you might have issues.

Weird how that works.

Fair enough. Ideally it would be the same for women too, but we're not there as a society yet.

Such an empty response. Do you know that women have to take precautions on dates out of fear of being killed? Literally, they have a rational fear of being killed by their male dates, and it's a commonly known and accepted fear that many women share.

Society moving forward is a nice idea; women feeling safe is a much better one, and attitudes like yours are part of the reason women generally do not feel safe. Deepfakes are not freeing at all.

Literally, they have a rational fear of being killed by their male dates, and it's a commonly known and accepted fear that many women share.

No joke, stop dating shitty men.

I get that's too difficult for a lot of them, though.

How do you identify shitty men? They don't wear labels. In fact they do their best to hide their shitty behavior at first.

Learn from your experiences, the experiences of others, and try to be a better judge of character. Don't hang around bad crowds.

Unfortunately, most people can't rise above peer pressure and think a group is always correct even if it's comprised of shitty people.

What you don't do is keep doing the same thing and expecting different results.

Your responses are tone-deaf as fuck.

The experience of other women is that they thought they could trust certain men, and they ended up dead. My experience is that men can massively change their behavior depending on the social setting: just guys, guys and girls, or alone with a woman.

Besides, sometimes it's not even a bad-crowd thing or a matter of judging character. Sometimes some creep gets obsessed with a woman, and if she turns him down the "wrong" way, the dude can flip and kill her.

There have been multiple incel killers. What did the women do wrong that time, oh great wise one?

I encourage you to make women friends and actually listen to their experiences. It is not as simple as you make it out to be.

YOU ARE NOT ANGRY ENOUGH

is this the new "conform"? "Be angry"?

Poison the well is how we free ourselves from the vastness of the digital landscape that encompasses us. Make all data worthless.

I think there's a big difference between creating them and spreading them, and punishing the spreading of nudes against someone's will, real or fake, is a better third option. The free-speech implications of banning software capable of creating them are too broad and fuzzy, but I think that harsh penalties for spreading them, on the grounds of harassment, would be clear-cut and effective. I don't see a big difference between spreading revenge porn and spreading deepfakes, and we already have laws against spreading revenge porn.

With AI and digital art... what is real? What is a person? What is a cartoon, or a similar-but-not-identical likeness? In some cases, what even is nudity? How old is an AI image? How can anything then be legal or illegal?

It's not a serious issue at all.

Of course, if you're the kind of greedy/lazy person who wants to make money off of pictures of their body, you're going to have to find a real job.


I seriously don't get why society cares if there are photos of anyone's private parts.

I think we as a society are too uptight about nudity, but that doesn't mean that creating pictures of people without their consent, which make them feel uncomfortable, is in any way OK.

What about photos of politicians in compromising situations? Should we have them without their consent?

I think the issue is that there is sexual imagery of the person being created and shared without that person's consent.

It's akin to taking nude photos of someone without their consent, or sharing nude photos with someone other than their intended audience.

Even if there were no stigma attached to nudes, that doesn't mean someone would want their nudes to exist or be shared.

I’ll continue this conversation in good faith only after you’ve shown us yours to prove your position.


Someone at TikTok has the power to make nudes of everyone on the planet, except for five homeless guys in LA you don't want a nude from anyway. TikTok has images of you (you idiot) and the hardware and software required to fake you to everyone you know.

Welcome to China 2.0!

No, they don't. Neither does Instagram; to my knowledge, they have two, posted by other people. Now, Grindr on the other hand...

Oh I would expect Grindr to call this a feature....

You liked Jeff, but Jer, for an extra $7.53 you can automatically see him naked to reveal the full package!

Pal, what the fuck are you talking about? TikTok and China are not mentioned anywhere in this article and nowhere on TikTok is there an option to generate anyone's likeness, clothed or unclothed.