Teen girls are being victimized by deepfake nudes. One family is pushing for more protections

MicroWave@lemmy.world to News@lemmy.world – 466 points –
apnews.com

A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

I don't know what a reasonable "protection" looks like here: the only thing I foresee is 14 year old boys getting felonies, but no one being protected.

Right, there are plenty of reactive measures available but the only proactive measures are either restricting availability of the source photos used or restricting use of the deep fake tools used. Everything beyond that is trying to put the genie back in the bottle.

At some point, communities and social circles need to be able to moderate themselves.

Disseminating nudes of peers should be grounds for ostracizing, but it really depends on the quality of people around you.

That doesn't work. It's nothing but an inconvenience to not talk to your neighbors or those around you. They'd just get even worse and make even worse friends online.

Ostracization doesn't work. Ever. Period. If they're bad enough, banishment works. Ostracization is just literally ignoring the problem.

It's not possible to restrict deep fake technology at this point. It's out there. Accessible to everyone who wants it and has a computer at home.

Both of those options seem impossible in practice.

And that's the point I was making, nobody can be "protected" from widely available photos being used on widely available programs. Best we can do is deter but that isn't a guarantee.

Even if you don't want to consider it CSAM, it is, at the very least, sexual harassment. The kids making and circulating these pictures and videos should be facing consequences. And the fear of consequences does offer some degree of protection at least.

It looks like pretty severe sexual harassment at best. Unfortunately the people I think are most likely to do it are teenagers with poor self control who don't realize the severity.

I think if schools can implement appropriate restorative responses and education on the harm done, that could be much more effective than draconian punishments after the fact.

Should a teenager face consequences for drawing a picture of their classmate naked? What if they do it well? How is this at all different?

If they distribute the drawing, yes. And the difference is that a drawing is immediately recognisable as a drawing, but an AI generated image or video isn't necessarily easily recognisable as not being real, so the social consequences for the person depicted can be much worse.

Methinks this problem is gonna get out of fucking hand. Welcome to the future, it sucks.

AI is out of the bag for all the good and bad it will do. Nothing will be safe on the internet, and hasn't been for a long time now. Either we will get government monitored AI results or use AI to combat misuse of AI. Either way isn't preventative. The next wild west frontier is upon us, and it's full of bandits in hiding.

Maybe it is just me, but it's why I think this is a bigger issue than just Hollywood.

The rights to famous people's "images" are bought and sold all the time.

I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be "owned" by anyone but yourself.

The fact that making a law like this isn't a priority means this will get worse because we already have a society and laws that don't respect our rights to control of our own image.

A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else's image. In this case it could even be considered defamation for altering the image to make it seem like it was real. They defamed her by making it seem like she took nude photos of herself to spread around.

There are genuine reasons not to give people sole authority over their image though. "Oh that's a picture of me genuinely doing something bad, you can't publish that!"

Like, we still need to be able to have a public conversation about (especially political) public figures and their actions as photographed

Seems like a typical copyright issue. The copyright owner has a monopoly on the intellectual property, but there are (genuine reasons) fair use exceptions (for journalism, satire, academic, backup, etc.)

Reminder that the stated reason for copyrights to exist at all, per the US Constitution, is "To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries."

Anything that occurs naturally falls outside the original rationale. We've experienced a huge expansion of the concept of intellectual property since then, but as far as I can tell there has never been a consensus on what purpose intellectual property rights are supposed to serve beyond the original conception.

Makes sense. If I do something worth taking a picture of that means I have zero rights to it since that is "natural", but the person who took the photo has all the rights to it.

Tell me this crap wasn't written for and by the worst garbage publishers out there.

Yeah I'm not stipulating a law where you can't be held accountable for actions. Any actions you take as an individual are things you do that impact your image, of which you are in control. People using photographic evidence to prove you have done them is not a misuse of your image.

Making fake images whole cloth is.

The question of whether this technology will make such evidence untrustworthy is another conversation that sadly I don't have enough time for right this moment.

That sounds pretty dystopian to me. Wouldn't that make filming in public basically illegal?

In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or in close-up) without asking for permission first. With the exception of public events, as long as you do not focus on individuals. It doesn't feel dystopian at all, to be honest. I'd rather have it that way than end up on someone's stupid vlog or whatever.

Wait you thought this was a problem for Hollywood?

It is for actors, since you would be handing over the right to your likeness to studios for AI to reproduce for eternity.

It was one of the main issues for the SAG-AFTRA strike.

Well at least they're getting paid for it. But someone could copy your likeness for free

The tools used to make these images can largely be ignored, as can the vast majority of what AI creates of people. Fake nudes and photos have been possible for a long time now. The biggest way we deal with them is to go after large distributors of that content.

When it comes to younger people, the penalty should be pretty heavy for doing this. But it’s the same as distributing real images of people. Photos that you don’t own. I don’t see how this is any different or how we treat it any differently than that.

I agree with your defamation point. People in general and even young people should be able to go after bullies or these image distributors for damages.

I think this is a giant mess that is going to upturn a lot of what we think about society but the answer isn’t to ban the tools or to make it illegal to use the tools however you want. The solution is the same as the ones we’ve created, just with more sensitivity.

Maybe I'm just naive of how many protections we're actually granted but shouldn't this already fall under CP/CSAM legislation in nearly every country?

Would it? How do they prove the age of an AI generated image?

By.... checking the age of the person depicted in the image?

...who by definition is AI generated and does not, in fact, exist?

What? But they literally do exist, and they're hurting from it. Did you even read the post?

While you're correct, many of these generators are retaining the source image and only generating masked sections, so the person in the image is still themselves with effectively photoshopped nudity, which would still qualify as child pornography. That is an interesting point that you make though

The article is about real children being used as the basis for AI-generated porn. This isn't about entirely fabricated images.

Just ask ChatGPT to cut them in half and count the rings.

Won't somebody think of the make believe computer generated cartoon children?!

Someone has to pay... this image is only 2 hours old....TWO HOURS OLD, YOU ANIMALS

In Germany, it would.

Not a lawyer, but 99% sure it's the same here in Canada as well.

Australia too. Hentai showing underage people is illegal here. From my understanding it's all a little grey depending on the state and whether the laws are enforced, but if it's about victimisation the law will be pretty clear.

Absolutely absurd. Criminalizing drawings is the stupidest thing in the world.

This case should already be illegal under harassment or similar laws. There's no reason to make drawings illegal

In Germany even a written story about it is illegal. It is considered "textual CSAM" then.

Nah dude, I am perfectly cool with animated depictions of child sexual exploitation being in the same category as regular child exploitation, regardless of the fact that she's actually a 10,000-year-old midget elf or whatever paper-thin explanation they provide not to be considered paedos.

Well that's just absurd and you should rethink your position using logic rather than emotion.

"that's just absurd"

Well that's an emotional response that includes no specifics or appeals to logic.

"rethink your position using logic rather than emotion"

Lol.

Well that’s an emotional response that includes no specifics or appeals to logic.

You clearly have no logic so I'm not going to appeal to it. Just a general comment.

I am shocked to my core that you would write off an observation as having no logic. Shocked I tell you!

I am perfectly cool with animated depictions of child sexual exploitation being in the same category as regular child exploitation regardless

Not an observation. You're saying you don't care about actual victims, children actually being abused. Real, live victims. That's not worse than someone drawing some pictures you don't like?

Wow, you got me. That's totally what I am saying that you had to fabricate a quote that represents the argument you want to have.

Honest opinion:

We should normalize nudity.

That's the only healthy relationship that we can have with our bodies in the long term.

There's a pretty big fucking difference between normalizing nudity and people putting the faces of 14 year olds into porn videos through deepfakes.

This isn't even the problem going on, though? Sure, normalize nudity, whatever, that doesn't fix deep faked porn of literal children.

It's the idea that people may be less sexually depraved if we normalize people being naked. Duh. In my opinion the whole world just needs to normalize sexuality across the board. I think it would solve many of society's problems, frankly

Why is that a problem though? Youre allowed to draw a picture of a specific child naked, why is it suddenly a crime if you use a computer to do it really well?

Having spent many years in both the US and multiple European countries, I can confidently say that the US has the weirdest, most unnatural, and most unhealthy relationship with nudity.

For this to happen people would probably need to stop judging people on their bodies. I am pretty sure there is a connection there. With how extremely superficial media and many relationships are, and with how we value women in particular, this needs a lot of change in people and society.

I also think it would be a good thing, but we still have to do something about it until we reach that point.

There might be an upside to all this, though maybe not for these girls: with enough of this people will eventually just stop believing any nude pictures "leaked" are real, which will be a great thing for people who had real nude pictures leaked - which, once on the Internet, are pretty hard to stop spreading - because other people will just presume they're deepfakes.

Mind you, it would be a lot better if people in general culturally evolved beyond being preachy monkeys who pass judgment on others because they've been photographed in their birthday-suit, but that's clearly asking too much so I guess simply people assuming all such things are deepfakes until proven otherwise is at least better than the status quo.

Photoshop is a 40yo tool and people still believe almost every picture.

Yes, but good Photoshop has a high skill ceiling. Generative AI does not.

In previous generations the kid making fake porn of their classmates was not a well liked kid. Is that reversed now? On the basis of quality of tech?

That kid that doodles is creepy. But deep fakes probably feel a lot closer to actual nudes.

So as a grown woman, I'm not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That's more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won't dare to share it outside their sick incels club.

That's fine and well. Except they are videos, and it is very difficult to prove they aren't you. And the internet is forever.

This isn't like high school when you went to high school.

Agreed on your last paragraph.

Then nude leak scandals will quickly become a thing of the past, because now every nude video/picture can be assumed to be AI generated and are always fake until proven otherwise.

That's the silver lining of this entire ordeal.

Again, this is a content distribution problem more than an AI problem; the liability should be on those who willingly host deepfake content rather than on AI image generators.

That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends, their jobs, and have their life ruined even if they prove that they are completely innocent

Plus, something I've already seen happen is someone says a nude is fake and are then told they have to prove that it's fake to get people to believe them... which is very hard without sharing an actual nude that has something unique about their body

The rest of the human body has more unique traits than the nude parts. Freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows.

Now that I think about it, we all proobably need a tattoo. That should clear anyone instantly.

You can ask an AI to draw a blurred version of the tattoo. Or to mask the tattooed area with, I don't know, piece of clothes or something.

Yes I'm sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

HR probably wouldn't even allow a conversation about it. That person just never gets called back.

And then the worse part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they?

The entire thing is damaging and ugly.

If you are already an employee, then they will want to keep you and look into the matter.

If you are not an employee yet - is HR really looking up porn of everyone?

Yes, HR Googles your name. 🙄

I am pretty sure people who do porn use pseudonyms anyway. If HR thinks the people use their real name and spread their porn on the internet, they are dumb for not realizing it's fake. HR being HR as always.

Seems we're partially applying market dynamics of supply and demand. Simply assuming the "surplus" supply of deep fakes will decrease their value ignores the fact that the demand is still there. Instead what we get is new value opportunities in the arms race of validating and distributing deep fakes.

Why should they have to expend any energy proving it's not them?

I mean they obviously shouldn't have to, but if nude photos of you got leaked in your community, people would start judging you negatively, especially if you're a young woman. Also in these cases where they aren't adults it would be considered cp.

So they do it and share it around to slut shame you

You try to find a job and they find porn of you

It’s a lot worse than you’re making it out to be when it’s not you that gets to make that decision

IMO the days of searching for porn of prospective employees are over. With the advent of AI generated porn, what would be the point of that?

There are so many recent articles linked on Lemmy about people losing their jobs over making porn. People are losing jobs over porn now more than ever.

Seriously? Maybe we don't read the same stuff but that's not something I've noticed.

I just can't imagine how that's possible. I wish someone would fire me over porn so I could sue them for unfair dismissal as well as defamation and/or libel.

You may not be representative of teenage girls.

So as a grown woman

Right? Literally not what's being discussed. Obviously they'll be more mature and reasonable about it. Teenagers won't be

I wasn't very representative even when I WAS a teenager. I was bullied quite a bit, though.

And can you imagine those bullies creating realistic porn of you and sharing it with everyone at school? You may have been strong enough to endure that - but it's pretty unrealistic to expect everyone to be able to do so. And it's not a moral failing if somebody is unable to. This is the sort of thing that leads to suicides.

What if the deep fake was so real it was hard to tell? What if it was highly invasive and humiliating? Can you see the problem?

I think that the point this comment is trying to make is that because it has become so easy to make these images, their existence is not very meaningful. All deep fakes are very realistic. You can't tell fakes from originals.

Like as an adult, if I saw an "offensive" image of a co-worker, my first assumption would be that it's probably AI generated, my first thought would be "which asshole made this image" rather than "I can't believe my co-worker did [whatever thing]".

I wonder what the prevalence of this kind of behavior is like in countries that aren’t so weird about sex.

This has nothing to do with "being weird about sex" and everything to do with men treating women poorly.

You can expect this to be worse in nations where women don't have as many rights and/or where misogyny is accepted as part of life.

Sounds plausible, we just abolished Roe, so…. It’s not looking great for the future of this issue in the US.

Kids don't know or understand the damage this can cause someone.

They see it as a joke most of the time.

It needs to be made illegal and the kids properly educated about why.

It's easy as an adult to condemn these children but we have a lot more life experience.

So, you’re saying both:

  1. It’s childish behavior
  2. It should be made illegal

So… you think the solution to childish behavior is putting kids in jail?

*deep breath* lemme try to see a more logical interpretation….

Wait, you did mention education, ok I musta missed that on my first read.

So educate the kids, and if they don’t learn… jail

Didn't say jail, you did. I in fact didn't talk about punishment at all.

But there has to be consequences.

If kids steal we don't just throw them straight in jail. But it is a possible consequence.

We're also talking about 14 year olds not literal children.

14 year olds are absolutely literally children

What's with remnants of reddit and pretending teenagers are kids? They aren't, they are teens; they can even make babies, drive and vote.

Illegal necessarily implies punishment, as far as I understand.

Also, 14 year olds are children. But the trajectory of this conversation is clear, and it’s not going anywhere.

Well that's the result when you put words in peoples' mouths, instead of trying to have a discussion.

What sort of "consequences" are you talking about that aren't punishment?

the kids properly educated about why.

https://dare.org/

  • DARE is celebrating its 40th anniversary.
  • It has officer-led classroom lessons that reach 2,500,000 K-12 students per year.
  • "Enriching students across the US and 29+ countries around the world"

If your argument is "The educators just need to make sure the kids learn that this is not a joke", DARE has been educating students about the dangers of illegal drugs for 40 years.

Overdoses claimed more than 112,000 American lives from May 2022 to May 2023, according to the Centers for Disease Control and Prevention, a 37 percent increase compared with the 12-month period ending in May 2020.

https://www.pbs.org/newshour/health/how-dozens-of-u-s-adolescents-are-dying-of-drug-overdoses-each-month-shown-in-3-charts

You might persuade some, but the problem will not go away.

DARE is known as a bad program, because it goes for fear mongering rather than actual education. Everyone knows someone who uses marijuana, and their teeth haven't all fallen out and they haven't turned into psychotic murderers. [Vox – Why anti-drug campaigns like DARE fail](https://www.vox.com/2014/9/1/5998571/why-anti-drug-campaigns-like-dare-fail)

There are good and bad ways to go about education. Like comprehensive sex education vs abstinence-only: even though they cover the same topic, actual education is much more effective than just "say no". [NLM – Abstinence-only and comprehensive sex education study](https://pubmed.ncbi.nlm.nih.gov/18346659/)

That was my point. DARE didn't stop drug use. Any education will persuade some. However, unless the students and their families buy in at 100%, this problem isn't going away.

About 130 million adults in the U.S. have low literacy skills according to a Gallup analysis of data from the U.S. Department of Education. This means more than half of Americans between the ages of 16 and 74 (54%) read below the equivalent of a sixth-grade level.

https://www.apmresearchlab.org/10x-adult-literacy

The starkest differences were seen by education group. Returning to the first question given above, in many countries adults with a "low" level of education (the equivalent of completing secondary school) had less than a 50% chance of getting the question correct. In places like Canada and United States, this fell to as low as 25%.

https://phys.org/news/2018-03-high-adults-unable-basic-mathematical.html

Education alone is not going to make this go away.

I 100% agree that education alone will not resolve the issue, but I believe education can help the efficacy of other approaches.

I did a report on the dangers of LSD when I was young.

I learned it’s impossible to overdose on and nobody has died directly as a result of it.

I had never been so interested in trying something out. “Okay so the world becomes crazy for 4-8 hours and you see crazy stuff and everything is hilarious and you can’t die at all and all you gotta do it be in a comfy set and setting”

God damn, Imma clean out a vial and watch Enter the Void

Quick edit: staring at my MacBook Pro turned into fractals and it's just fucking anodized aluminium wtf that's cool

You're using DARE as a positive example‽ The DARE program is widely considered to be an enormous failure. Here's a decent rundown:

https://www.talkitoutnc.org/dare-program-effectiveness/#:~:text=program%20failed%20to%20live%20up,rate%20of%20teen%20drug%20use.

(But if you just search it up you'll find hundreds of similar articles)

I was in school when the DARE program was quite strongly promoted and I specifically remember being fed endless misinformation about drugs. It was never about educating children it was about trying to scare them with bullshit.

"If they were wrong about marijuana being addicting they're probably wrong about everything else..."

...aaaaand that's how young people ended up trying all sorts of new things they shouldn't have.

Other people seem to think you're holding up DARE as a positive example. I can tell you're not, but I don't think it's a great negative example either. So much of the content is fear mongering bullshit that anyone who actually encounters drugs in real life will see through it.

Education works a lot better when you teach kids things that aren't directly contradicted by their experiences or their peers'.

I think at some point kids need to learn that there won't be someone stopping them from doing bad things.

They need to suffer the consequences of their actions through social rejection. If the microcosm is so shitty that it doesn't ostracize people who disseminate nudes, then the people in it deserve to suffer until they improve.

This should be one of the easiest ways to identify shitbags, but I understand a lot of social hierarchies put shitbags at or near the top.

What does this have to do with the other? Where I live nudity isn't all that uncommon (when compared to the US, for example). But sexually harassing someone with fake porn is whole different issue.

I see a lot of problems with people having trouble understanding consent and struggling to respect other people. Those boys are weird about sex. That's the weirdness we should address.

My bad, I wasn’t as clear as I could have been. I meant, I wonder if boys would be so weird as to want to make such fake porn in places that are less weird about sex.

Did you think I was advocating for the fake images?

Naked pictures are all over the internet and they still wanted to make porn of their classmates. It's not about wanting the porn, it's about wanting to hurt the girls.

I first thought it was more about the boys having a shitty concept, or no concept at all of consent… but that it was ultimately a horrible expression of interest in the girl. But no doubt both are possibilities. And neither is okay.

No I thought you meant that being hurt by fake porn about yourself is "being weird about sex".

Boys and men are pretty similar the world over. Some are always going to be creeps who do shit like this, it doesn't matter what culture they're in.

What's the fundamental difference between a deep fake and a good Photoshop and why do we need more laws to regulate that?

Lower skill ceiling. One option can be done by pretty much anyone at a high volume of output; the other requires a lot of training and isn't available to your average basement dweller.

Good luck trying to regulate it though, Pandora's box is opened and you won't be able to stop the FOSS community from working on the tech.

The problem is how to actually prevent this. What could one do? Make AI systems illegal? Make graphics tools illegal? Make the Internet illegal? Make computers illegal?

Make "producing real or simulated CSAM" illegal?

Isn't it already? Has it provided any sort of protection? Many things in this world are illegal, and nobody cares.

Yes, I would argue that if CSAM was legal, there would be more of it...meaning it being illegal provides a level of protection.

I wonder why you're being downvoted; something being illegal puts fear in most people not to do it.

I've been wondering about this lately, but I'm not sure how much of an effect this has. There are millions of people in prison, and many of those will go on to offend again. Making things illegal can be seen as an agreement to a social contract (in a democracy), drive the activity underground (probably good thing in many cases), and prevent businesses (legal entities) from engaging in the activity; but I'm not sure how well it works on an individual level of deterrence. Like, if there were no laws, I can not really think of a law I would break that I wouldn't already break regardless. I guess I'd just be more open about it.

Though, people who cause harm to others should be removed from society, and ideally, quickly rehabilitated, and released back into society as a productive member.

It is where I'm at. Draw Lisa Simpson nude and you get a visit from the law. Dunno what the punishment is though. A fine? Jail? Can't say.

Edit: Apparently I was wrong, it has to be a realistic drawing. See here: 2010/0064/COD doc.nr 10335/1/10 REV 1

What about making depictions of other crimes? Should depictions of theft be illegal? Depictions of murder?

Why should depictions of one crime be made illegal, but depictions of other heinous crimes remain legal?

Because a picture of someone robbing my house doesn't revictimize me. Even if it's simulated, every time they run into some rando who recognizes them or every time a potential employer runs a background/social media check, it impacts the victim again

A picture of a cartoon child having sex doesn't victimize you either, the same way a drawing of a robbery doesn't victimize you

You mean being raped. What it does is let pedos feel like it's OK to be pedos.

Lol just like violent video games makes people think it's ok to be violent in real life?

Who is being victimized with a drawing of Lisa Simpson?

I studied Computer Science so I know that the only way to teach an AI agent to stop drawing naked girls is to... give it pictures of naked girls so it can learn what not to draw :(

hmmm - I wonder if it makes sense to use generative AI to create negative training data for things like CP. That would essentially be a victimless way to train the AIs. Of course, that creates the conundrum of who actually verifies the AI-generated training data...

this doesn't work. The AI still needs to know what CP is in order to create CP for negative use, so you need to first feed it CP. A recent example of how OpenAI was labelling "bad text":

The premise was simple: feed an AI with labeled examples of violence, hate speech, and sexual abuse, and that tool could learn to detect those forms of toxicity in the wild. That detector would be built into ChatGPT to check whether it was echoing the toxicity of its training data, and filter it out before it ever reached the user. It could also help scrub toxic text from the training datasets of future AI models.

To get those labels, OpenAI sent tens of thousands of snippets of text to an outsourcing firm in Kenya, beginning in November 2021. Much of that text appeared to have been pulled from the darkest recesses of the internet. Some of it described situations in graphic detail like child sexual abuse, bestiality, murder, suicide, torture, self harm, and incest.

source: https://time.com/6247678/openai-chatgpt-kenya-workers/
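The pipeline the article describes is, in outline, ordinary supervised text classification: collect human-labeled examples, train a detector on them, and screen model output with the detector before it reaches the user. Here's a minimal sketch of that loop in Python with scikit-learn. The tiny dataset is a harmless placeholder invented for illustration, not real moderation data.

```python
# Sketch of the "label examples -> train detector -> filter output"
# loop described above, using toy placeholder data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Human-labeled snippets: 0 = acceptable, 1 = toxic (placeholders).
texts = [
    "have a nice day",
    "you are a wonderful person",
    "thanks for the help yesterday",
    "i will hurt you",
    "i hate everyone here",
    "you are worthless and stupid",
]
labels = [0, 0, 0, 1, 1, 1]

# Train the detector on the labeled examples.
detector = make_pipeline(TfidfVectorizer(), LogisticRegression())
detector.fit(texts, labels)

def filter_output(candidate: str) -> str:
    """Suppress a candidate reply if the detector flags it as toxic."""
    if detector.predict([candidate])[0] == 1:
        return "[filtered]"
    return candidate
```

A production moderation model works the same way in outline, just with a large neural model instead of TF-IDF and millions of labeled snippets, which is exactly why humans have to produce and look at those labels in the first place.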

President Joe Biden signed an executive order in October that, among other things, called for barring the use of generative AI to produce child sexual abuse material or non-consensual "intimate imagery of real individuals." The order also directs the federal government to issue guidance on labeling and watermarking AI-generated content to help differentiate between authentic material and material made by software.

Step in the right direction, I guess.

How is the government going to be able to differentiate authentic images/videos from AI generated ones? Some of the AI images are getting super realistic, to the point where it's difficult for human eyes to tell the difference.

That's a cool quiz, and it's from 2022. I'm sure AI has improved since then. Would love to see an updated version.

I wouldn't call this a step in the right direction. A call for a step, yeah, but it's not actually a step until something actually gets done.

reading this, I don't really know what is supposed to be protected here, or what would even qualify for protection in the first place.

closest reasonable one is the girl's "identity", so it could be fraud. but it's not being used to fool people. more likely, those getting the pics already know it's AI generated.

so might be defamation?

the image generation tech is already easily accessible so the girl's picture being easily accessible might be the weakest link?

Not a lawyer, but I'll take a stab. Pretty sure it's illegal to create sexual images of children, photos or not. It's also illegal to use someone's likeness without permission, but admittedly this depends on the state in the US: https://en.wikipedia.org/wiki/Personality_rights

Thanks for the valuable contribution to this discussion! It does appear this is a question of identity and personality rights, regarding how one wants to be portrayed.

Reading that article though, it seems like that only applies to commercial purposes. If one is making deep fakes for their own non-commercial private use, it doesn't appear personality rights apply.

Pretty sure it's illegal to create sexual images of children, photos or not.

Maybe in your dystopian countries where drawings are illegal. Absolutely absurd that you're promoting that as a good thing.

I'm not exactly sure what your point is. In the article, a kid created an unwanted sexual depiction of another kid and spread it around. I do think that should be illegal.

Yes, but this thread is about just drawings in general. Deepfaking someone into porn and spreading it around should absolutely be illegal. But it's not "child porn". It's some type of harassment or defamation or something.

This is treading on some dangerous waters. Kids need to realize this is way too close to basically creating underage pornography/trafficking.

I would rather have them realize that other people are to be treated with respect.


I think the best way to combat this is to ostracize anyone who participates in it.

Let it be a litmus test to see who is and is not worth hanging out with.

The problem with that plan is there are too many horrible people in the world. They'll just group up and keep going. Horrible people don't stop over mere inconvenience.


These deepfakes don't disappear. You can ostracize all you like, but that won't stop these from potentially haunting girls for the rest of their lives.

I don't know what the solution is, honestly.

Why should it haunt them? Even if the images were REAL, why should it haunt them? I'm so tired of the puritanical shame women are supposed to feel about their bodies. We all have the same basic equipment. If a guy makes a deep fake, it is HE who should feel shame and humiliation for being a sick pervert. Girls need to be taught this. Band together and laugh these idiots off campus. Name and shame online. Make sure HE will be the one haunted forever.

I don't mean psychologically haunt them, I mean follow them for the rest of their lives affecting things like jobs and relationships. It doesn't matter whether or not they're fake if people don't think they're fake.

Naming and shaming who did this to them will not stop them from being fired from their schoolteaching job in 15 years when the school discovers those images. Do you think "those were fake" is going to be enough for the school corporation if it's in, for example, Arkansas?

The more common this becomes the more easily it will be dismissed.

The solution is for no one to care or make a big deal out of it; they're not real, so you shouldn't care.


It's time for the butlerian jihad.

gee, here's a novel idea: don't let children have access to social media. that would solve a lot of other problems too

While I agree with that in principle, we shouldn't start blocking people, even young people, from accessing a lot of information.

Twitter, while now a cesspool, still has a lot of academics on it that share new ideas and discoveries.

Reddit, while shit, also has the value of helping people find niche hobbies and communities.

YouTube, while turning into shit, gives people access to video tutorials and explanations. Hell, while I was in school, half the time teachers assigned homework that required watching a YouTube video.

While blocking the youth from accessing social media is an idea, I think the drawbacks are too great.