Creating sexually explicit deepfakes to become a criminal offence

MicroWave@lemmy.world to News@lemmy.world – 119 points
bbc.com

The creation of sexually explicit "deepfake" images is to be made a criminal offence in England and Wales under a new law, the government says.

Under the legislation, anyone making explicit images of an adult without their consent will face a criminal record and unlimited fine.

It will apply regardless of whether the creator of an image intended to share it, the Ministry of Justice (MoJ) said.

And if the image is then shared more widely, they could face jail. 

A deepfake is an image or video that has been digitally altered with the help of Artificial Intelligence (AI) to replace the face of one person with the face of another.

She said it "will only criminalise where you can prove a person created the image with the intention to cause distress", and this could create loopholes in the law.

This sounds like it would be almost impossible to prove. Anyone can just say "I made the deepfake porn because it turns me on, and I shared it online because I thought other people would enjoy it too." This law is a good start but I don't think it goes far enough.

This is a tricky one - it's a law that's setting out to protect folks, but it's also laying some dangerous groundwork.

Can you explain more on that? This sounds perfectly reasonable to me.

So, the basis of this law is that a person's facial/physical parameters were used to make vulgar videos/images. This caused them distress, and so it is illegal. At this point, it is aiming to protect people from something deemed traumatic, and the law specifically requires the intent to be "to distress". It's a good law.

Let's say someone hand-draws a vulgar image using my physical parameters that I find distressing. Is that illegal? At the moment, no; however, it's not too great a step to see it pushed through by a similar argument (what if the artist was REALLY good, and they intended to distress me?). And from here we go...

What about a cartoon or caricature? Could someone draw an image of the UK PM performing oral sex on billionaires and fall afoul of the law if the subjects find it distressing? Surely such a cartoon cannot help but intend to dismay or distress the subjects?

Does it have to be vulgar, or just cause distress? Could they just mock a person using an image? Mockery certainly amounts to distress?

Does it have to be an image, or are distressing written pieces also actionable?

It shifts towards the idea that "artistic" creations that provoke distress to an individual ought to be illegal. This is a viewpoint I stand strongly against.

However, this law is groundwork. Groundwork can also push towards a lot of good. Then again, how much do you trust UK politicians to make informed internet laws?

You make good points. I think applying the law differently based on whether the victim was a private person or a public figure (much in the way libel/slander laws do - at least in the USA) would be a good measure to ensure that free speech is preserved, while innocent people are protected.

Might not be so easy in the UK, I can't say I know a lot about the application of the law in this regard. Though given the Horizon scandal, I definitely lean towards not trusting UK politicians to make informed laws regulating computers...

That may be a good compromise, though it does raise the question of how public a figure one must be before one waives the right not to have malice directed at them.

I think my main qualm is that well-meaning laws can often set a precedent for unforeseen issues down the road, and the curtailment of any liberty, no matter how vile that liberty is, must be done with care to avoid creating traps for the future. Something UK politicians are famous for failing to do (our digital safety laws are very much responsible for our lack of privacy, and have created some of the most dangerous data troves on the planet!).

It just feels like the wrong approach to me.

Who is to say an image depicts any particular person? Sure, a court/jury can try to decide, but I suspect it's possible to make deepfakes look enough like the victim to do the damage, while dissimilar enough that there's reasonable doubt the depiction was intended to represent the victim.

I feel like society needs to adapt to this. To me, in the age of deepfakes, there shouldn't be any shame in being depicted in a video; the shame should fall on the people sharing it and watching it.

Like if someone said they saw a naughty video of me, I don't really see why I should be ashamed, but I would ask them what the hell they're doing watching it.

It's not about shame... like wtf? People's careers can be affected. In a day and age where corporations request access to all your social media and do extensive background/social media checks, they may well stumble upon a deepfake porn video of you and refuse to hire you over it.

Teachers are being fired for having OnlyFans accounts; what do you think is gonna happen when deepfake porn videos of teachers start to pop up and parents start demanding the teacher be fired?

Ok, semantics. "Shame" may not be the best word for it but it's the best I can think of.

Not as in "I feel ashamed about this video" but rather people thinking you ought to feel ashamed.

In a day and age where [...]

In a soon to be day and age where you can tell your VR goggles "show me a video of that person at the library today", we shouldn't be judging that person at the library.

For anyone interested in the science, here's a video from Harvard psychiatrist/instructor Dr. K from HealthyGamerGG going over research on non-consensual pornography (deepfakes, revenge porn, and the like). The most impactful point he makes is, "After non-consensual pornography is released, over half of the people who are involved want to kill themselves". There are a few more minutes of content for those who want to watch, including discussion of increased rates of depression, anxiety, PTSD, etc., especially when the porn goes public.

I'll admit I used to look at deepfake porn, but I stopped immediately after hearing that and the rest of that video. Not because religion or society disapproves but because I want my entertainment to be consensual and fun - not so destructive that it makes the unwilling participants suicidal more often than not. I think there are ways to do porn properly that involve worker protections, respect, and of course consent. So I think this statute has a pretty solid foundation in severe harm reduction as per the research.

I think it's harmless to generate and enjoy deepfake porn for your private consumption. I don't do it because I like my nude women to have ten fingers and toes.

I'm not sure I buy that argument.

There is enough pornography to last you until the heat death of the universe. The only reason you would create your own pornography with people who never consented is that the transgression itself is what gives the sexual pleasure, not the pornography.

Enjoying something for its transgressive nature doesn't necessarily make it bad. Nor does that make everything okay; I'm just saying there is nothing inherently wrong with enjoying the transgressiveness of an act more than the act itself.

But this is highly philosophical and any attempt to seriously explain my thoughts (and I made three attempts before just going with this) quickly gets too long, so hopefully we can just agree that there's room for respectful disagreement.

No, I don't think there is room for respectful disagreement when one of us is arguing they don't require sexual consent.

No one consents to featuring in private fantasies.

A private fantasy is a thought. Creating non-consensual pornography is an action.

The longer this conversation continues, the higher your ick factor gets.

Masturbating is an action, too. What's the qualitative difference between a private image in one's head and one on a screen?

You seem to struggle with separating a philosophical conversation from whatever icks you. If it assuages your sensibilities I don't engage in this activity, but only because it's not arousing to me in any way. I just can't see any actual difference between a fake image in your head and an equally private image on a screen.

When an image becomes public or harassing I think there is a clear problem. If you want to disagree on where the line is between an image in your head and one used to harm someone, that's fine. Neither of us is drafting a law here so it's all just wind and words. Have a good day.

An image in your head isn't a physical product that exists in the material world.

If it's a deepfake of a celeb you like, you'll likely never get the real thing

I'm not a legal expert, but it seems like creating non-consensual sexual depictions should be illegal already?

People make art/cartoons of people/characters for personal use all the time, and it often veers NSFW. I don't know, as an American that seems like the kind of thing you should be free to do. But the harassment angle is undeniable.

Making art and having it are quite inalienable rights, but transmitting it online is when things get iffy... And if you had to transmit it to make it, as with cloud-based AI services... I think I could talk myself into agreeing with the need to pass a law on that basis.

You can generate deepfake nudes on a modest home gaming PC. No transmission required.
