Teen girls are being victimized by deepfake nudes. One family is pushing for more protections

MicroWave@lemmy.world to News@lemmy.world – 466 points – apnews.com

A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.


So as a grown woman, I'm not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That's more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won't dare to share it outside their sick incel club.

That's fine and well. Except they are videos, and it is very difficult to prove they aren't you. And the internet is forever.

This isn't like high school when you went to high school.

Agreed on your last paragraph.

Then nude leak scandals will quickly become a thing of the past, because every nude video or picture can now be assumed to be AI-generated, and thus fake until proven otherwise.

That's the silver lining of this entire ordeal.

Again, this is a content distribution problem more than an AI problem; the liability should fall on those who willingly host this deepfake content rather than on the AI image generators.

That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends and jobs and have their lives ruined, even if they prove that they are completely innocent.

Plus, something I've already seen happen is that someone says a nude is fake and is then told they have to prove it's fake to get people to believe them... which is very hard without sharing an actual nude that shows something unique about their body.

The rest of the human body has more unique traits than the nude parts: freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows them.

Now that I think about it, we all probably need a tattoo. That should clear anyone instantly.

You can ask an AI to draw a blurred version of the tattoo. Or to mask the tattooed area with, I don't know, a piece of clothing or something.

Yes I'm sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

HR probably wouldn't even allow a conversation about it. That person just never gets called back.

And then the worst part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they see it?

The entire thing is damaging and ugly.

If you are already an employee, then they will want to keep you and look into the matter.

If you are not an employee yet - is HR really looking up porn of everyone?

Yes, HR Googles your name. 🙄

I am pretty sure people who do porn use pseudonyms anyway. If HR thinks people use their real names and spread their porn on the internet, they are dumb for not realizing it's fake. HR being HR, as always.

Seems we're partially applying the market dynamics of supply and demand. Simply assuming the "surplus" supply of deepfakes will decrease their value ignores the fact that the demand is still there. Instead, what we get are new value opportunities in the arms race of validating and distributing deepfakes.

Why should they have to expend any energy proving it's not them?

I mean, they obviously shouldn't have to, but if nude photos of you got leaked in your community, people would start judging you negatively, especially if you're a young woman. Also, in these cases where they aren't adults, it would be considered child pornography.

So they do it and share it around to slut shame you

You try to find a job and they find porn of you

It’s a lot worse than you’re making it out to be when it’s not you that gets to make that decision

IMO the days of searching for porn of prospective employees are over. With the advent of AI generated porn, what would be the point of that?

There are so many recent articles linked on Lemmy about people losing their jobs over making porn. People are losing jobs over porn now more than ever.

Seriously? Maybe we don't read the same stuff but that's not something I've noticed.

I just can't imagine how that's possible. I wish someone would fire me over porn so I could sue them for unfair dismissal as well as defamation and/or libel.

You may not be representative of teenage girls.

"So as a grown woman"

Right? Literally not what's being discussed. Obviously they'll be more mature and reasonable about it. Teenagers won't be.

I wasn't very representative even when I WAS a teenager. I was bullied quite a bit, though.

And can you imagine those bullies creating realistic porn of you and sharing it with everyone at school? You may have been strong enough to endure that - but it's pretty unrealistic to expect everyone to be able to do so. And it's not a moral failing if somebody is unable to. This is the sort of thing that leads to suicides.

What if the deepfake was so real it was hard to tell? And what if the deepfake was highly invasive and humiliating? Can you see the problem?

I think the point this comment is trying to make is that because it has become so easy to make these images, their existence is not very meaningful. All deepfakes are very realistic; you can't tell fakes from originals.

Like, as an adult, if I saw an "offensive" image of a co-worker, my first assumption would be that it's probably AI-generated, and my first thought would be "which asshole made this image" rather than "I can't believe my co-worker did [whatever thing]".

Not really. The more extreme it is, the more easily people will believe you when you say it's a deep fake. Everyone who matters (friends and family) will know it's not you. The more this sort of thing becomes commonplace, the more people will simply shake their heads and move on.

People kill themselves over much more mundane things than this. Unfortunately, I think you overestimate teenagers; not everyone can handle it as lightly as you would. Telling people to just "shake it off" will simply not work most of the time.

Sadly, you have a point. Somebody with good support at home and a circle of friends can weather this sort of thing, but others may feel helpless or hopeless. There needs to be an effective place to turn to for kids who are being bullied. Unfortunately that doesn't seem to exist.

That depends on how a specific person is seen and treated by their surroundings.

A teenage girl who is already a victim of harassment or bullying, for example, will be treated very differently when humiliating images of her surface in her peer group, compared with someone who is well liked at school.

People who do this have to be judged much more harshly. This can't become the next item on the list of common sexual harassment experiences every girl and woman "has to" experience.
