South Korea has jailed a man for using AI to create sexual images of children, in a first for the country's courts

alphacyberranger@sh.itjust.works to World News@lemmy.world – 547 points –

Hear me out on this one:

If we take it as given that pedophilia is a disorder and ultimately a sickness, wouldn't it be better for these people to get their fix from AI-created media than from the real thing?

IMO no harm was done to any kid in the creation of this, and it would be better to give these people the fix they need, or at least desperately desire, in this way before they resort to more desperate and harmful measures.

You have a point, but in at least one of these cases the images used were of girls around him, and he even tried extorting one of them. Issues like this should be handled on a case-by-case basis.

Some of the comments here are so stupid: "either they unpedophile themselves or we just kill them for their thoughts"

Ok so let me think this through. Sexual preferences of any kind are pretty normal and they don't go away. Actually, if you try to ignore them they become stronger. Also, being a pedophile is not a crime currently; it's the acting on it. So what happens right now is that people bottle it up, then it gets too much and they act on it in gruesome ways, because "if I go to prison I might as well make sure it was worth it". Kids get hurt.

"But we could make thinking about it illegal!" No we can't. Say that's a law, what now? If you don't like someone, they're a "pedophile". Yay more false imprisonment. Also what happens to real pedophiles? Well they start commit more acts because theres punishment even for restraint. And the truth is a lot of ppl have pedophilic tendencies. You will not catch all of them. Things will just get worse.

So why AI? Well, as the commenter above me already said, if there's no victim, there's no problem. While that doesn't make extortion legal (I mean, obviously, that's a different law), this could help people with those urges exercise more restraint. We could even limit it to specific sites and make it non-shareable. We'd have more control over it.

I know people still want the easy solution, which evidently doesn't work, but IMO this is a perfect solution.

I largely agree with what you're saying and there definitely is no easy solution. I've never understood why drawings or sex dolls depicting underage people are illegal in some places, as they are victimless crimes.

The issue with AI generation that differentiates it a bit from the above is the fidelity. You can tell a doll or an anime isn't real, but a few years from now it'll be difficult to spot AI-generated images. This isn't unique to this scenario, though; it's going to wreak havoc on politics, scams, etc. But there is the potential that real CP comes out of hiding and is somewhat shielded by the AI-generated material.

Of course this is just speculation; I hope it would go the other way around and everyone just jacks off at their computers and CP disappears completely. We need to stop focusing our attention on people with pedophilia, get them mental support, and focus on sex offenders who are actually hurting people.

I'm all for letting people have their dolls, drawings, and AI-generated stuff, but yeah... it would become easy for offenders to say "Naw, I snatched that shit off of DALL-E" and walk free in court, so some kind of forensic tool that can tell AI-generated images from real ones would have to be made...

Actually, there are a lot of reasons we'd want a tool like that that have nothing to do with hypothetical solutions to kiddie diddling.

Can you imagine how easy extortion would become if you could show AI-generated pictures of your next-door neighbor killing some rando missing person in the area? But every new technology enables crime until we find out what the proper safeguards are, so I'm not too worried about it in the long term.

It’s also worth remembering that, to make super accurate pictures of children, the AI may have been trained on illegal photos (on purpose or not)

But that’s an issue for the AI creator

I pretty much agree, while we should never treat Pedophilia as "Just another perfectly valid sexuality, let's throw a parade, it's nothing to be ashamed of" (Having the urge to prey on children is ABSOLUTELY something to be ashamed of even if you can't control it.), we need to face facts... It isn't someone waking up one day and saying "Wouldn't it be funny if I took little Billy out back and filled him full of cock?"

It's something going on in their head, something chemical, some misfiring of the neurons, just the way their endocrine system is built.

As much as I'd love to wave a magic wand over these people (whom I reluctantly call people) and cure them of their desires, we don't have the power to do that. No amount of therapy in the world can change someone's sexual tastes.

So in lieu of an ideal solution, finding ways to prevent pedophiles from seeking victims in the first place is the next best thing.

It's not dissimilar to how, when we set up centers where drug-addicted people can get small doses of what they're addicted to so that they can fight withdrawal symptoms, crime and death rates go down. When you enact things like universal basic income and SNAP, people have less reason to rob banks and gas stations, so we see fewer of those crimes.

It's not enough to punish people who do something wrong, we need to find out why they're doing it and eliminate the underlying cause.

There's also a difference (not sure if clinically) between people who sexualize really young kids and people who like kids just under whatever age society has decided splits children from adults. In the USA, porn depicting the latter is fine as long as everyone involved is over the age of adulthood, even if they dress up to look younger.

I think in general, people who refer to pedophilia are usually referring to the former, not the 30-year-old dating a 17-year-old or whatever. But the latter makes it a little weird. Images of fictional people don't have ages. Can you charge anyone who has AI-generated porn with CSAM if the people depicted sort of look underage?

AI-generated content is gonna bring a lot of questions like these that we're gonna have to grapple with as a society.

The first part of your comment is rather confusing to me, but the latter part I fully agree with. Judging age from appearance is something that will haunt us even more with AI until we find new solutions. But that is gonna be one of a list of big questions to be asked in conjunction with new AI laws.

Pedo isn't a sexual preference any more than cannibalism is a dietary one...

You know what? Sure. Imagine I find people really tasty, especially hands. But I never chew on one; I just think about it. Literally the same thing. You should be rewarded for restraining these urges. If I'd get punished just for thinking about munching on a thumb, I'd at least take a hand with me to jail. I'm going there anyway.

That's basically how I feel. I'd much rather these kinds of people jack it to drawings and AI-generated images if the alternative is that they're going to go after real children.

At some point the fake images won't do it for them anymore, and then they'd turn their attention to real kids. We don't want to wait for that to happen.

It's like using a drug where your tolerance increases each time you use it; there will come a time when your old dose has no effect on your satisfaction level.

Is that proven or just bullshit speculation?

None of us are specialists here, so people saying it is harmless and people saying it isn't are both speculating

Seems like speculation, but personally I'd be amazed if it were completely incorrect.

If people who are attracted to children are constantly looking at CP, they are inevitably going to become more comfortable with it. Same with any other type of porn: do you think people who watch tons of torture porn don't become increasingly unaffected by it? It's the same for any other illegal or shocking content. I spent enough time on 4chan 10+ years ago to vouch for this personally.

I'm not saying that everyone who looks at these AI images will act on their desire, but some people will absolutely end up wanting more after having open access to pictures of naked children.

Honestly, it's a bit concerning how people are voting this down. Why do we value the sexual gratification of pedos higher than the potential safety of children?

This is the War on Drugs argument all over again, except using porn instead of marijuana as a "gateway".

You're correct that there can be some crossover and some unstable people could have an addiction that gets out of control, but I don't think there's any proof that happens in high enough numbers.

By your logic, does everyone who's into bdsm have a sex dungeon in their bedroom?

Your comment reduces everyone to their base fetishes, as if that were the only thing exerting pressure on an individual to act, and I don't believe that's the case.

I'll come right out and say it, I'm into inflation.

The number of times I've gone out, bought a helium tank, and shoved a tube up anyone's ass is just about equal to the number of times I've been the Republican candidate for the US presidency... and I'm not even 35 yet.

I think we all have weird kinks, it's a part of the human experience.

Heck imagine if we thought this way for EVERY sexual desire someone had.

"Porn for people who prefer blondes? I dunno, what if they get carried away and start dying random brown haired people? The consequences are too great!"

Sounds fucking ridiculous when you think of it that way.

Do you know how much porn there is of the My Little Pony characters? Tons

Do you know how much of an epidemic there is of cartoon watchers going out and fucking ponies? Somewhere between null and zilch... Maybe one or two extreme cases, but that's around the same number of people who watch superhero movies and try to jump off the roof in order to fly.

This is a slippery slope fallacy if I ever saw it.

Heck, if anything we've seen that restrictions on porn actually lead to increased instances of sexual assault, in the same way a crackdown on drugs just leads to more deaths from overdoses.

If letting some sicko have fake images of pretend children saves even one real child from being viciously exploited, I think it's worth it.

It's not ideal and yeah, it makes the skin of any sane person crawl... Ideally we should be out curing pedophiles of their sexual urges entirely, but since we don't have a way to do that, why let perfect be the enemy of good? I mean, what other ideas do we have? Because "To Catch a Predator" may have been good television, but even that had ethical concerns, ending in lost lawsuits and suicides, and castrating everyone convicted isn't exactly 8th Amendment friendly... and even then that only prevents repeat offenses, not initial offenses. (Prevention > Cure)

Now all this aside, we do need to look at this on a case by case basis. If real children are being used to model for the AI or fake images are used as a form of blackmail (Think "Revenge Porn", but way, way worse), then cuffs need to be slapped on people.


Jfc, what's with these pedo apologists. If someone were a cannibal, would it be totally fine to just give them human flesh removed during surgeries or taken from dead people? Maybe let him pay people so he can eat them and drink their blood? AI images are trained on actual CP, and CP should not be normalized anyway. If someone has ideations of violence, then the last thing you do is feed those ideations. Would you think a suicidal person should watch simulated suicide? Why would watching simulated acts of depraved violence because you enjoy them somehow prevent you from committing that act yourself? If you enjoy something that much, then you are thinking about doing it yourself.

Actually, the analogy here would be to give "wannabe cannibals" synthetic meat or something that tastes like human meat

Except for the source CSAM required to get the model started, of course.

That is not required. The larger models especially, like DALL-E 3, can combine concepts even without being directly trained on them. The one they had in the showcase for DALL-E 2 was a chair shaped like an avocado. The model knows what a chair is and it knows what an avocado is, so it can combine them. So it can know "this is what a naked human looks like" and "this is what a human child looks like" and could combine them without having ever seen CSAM.

Did somebody audit the dataset?

Ya, I thought so...

"I don't personally know what's in the data set, so it must include CP" is a breathtakingly pathetic argument. Shame on you.

So if they audit the inputs and can guarantee no naked kids, you're fine with this??

See, this is the problem with this entire thread. You guys are, rightfully, upset about what you perceive, but then you take that and use that energy to spread false claims because you think they're true, or you think the lie will help bolster your side of the argument. Instead it just makes you look disingenuous and paints a bad look for everyone. There are plenty of accurate points to be made for why this stuff isn't okay, but you guys are making none of them, very matter-of-factly. Do better. Be better. Make valid arguments, based on fact, or do what people used to do and sit back and let the people who know what they're talking about make the arguments.


I don't believe it's a sickness. Humans vary in innumerable ways, and defining natural variations as sickness is a social decision, not a medical one. If you look at the DSM you will find that social problems are sometimes given as a reason for defining something as an illness. This is just the medicalisation of everything.
Even if you grant that it's a sickness, how does it follow that the sickness should therefore be treated with AI? I see no argument or logic here. Do you think harm would be done if the paedophile knows the child? If the child finds out they are the object of rape fantasies? If you find you are married to a person who gets off on raping children? Your children?
Do you allow for disgust and horror at sadistic desires, or are we "not allowed to kink-shame"?

Sex offenders aren't allowed to watch porn at all in my state.

Because science suggests that watching porn, and getting your fix (as you put it) through porn, encourages the behavior.

Watching child porn teaches the mind to go to children to fulfill sexual urges. Mindfulness practice has been shown to be effective in curbing urges in all forms of addiction.

So, no. Just no to your whole post.

There are effective treatments for addictions, whether sexual or otherwise, whether the addiction feeds on children or heroin. And we don't need to see if fake child porn helps. Evidence already suggests it doesn't, and we already have effective treatments that don't put children at risk and that don't encourage the behavior.

Not judging or downvoting your comment, but do you have the data at hand? Just out of interest.

Some input though: you are not making a distinction between offenders and non-offenders, and I doubt there is even good data on non-offenders to begin with.

As mentioned on another one of your comments, I am having a hard time finding the science you reference.

This isn't about addiction, it's about sexuality. And you can't just curb your whole sexuality away. These people have a disorder that makes them sexually attracted to children. At this point there is no harm done yet. They just are doomed to live a very unfulfilling life, because the people with whom they want to engage in sexual practices can't give their consent, which is morally and legally required, no question about that. And most of them don't give in to these urges and seek the help they need.

But still, you can't just meditate your whole sexuality away. I don't want to assume, but I bet you also masturbate or pleasure yourself in one way or another; I know I do. And when I was young, fantasy was all I needed, but then I saw my first nude and watched my first porno and it progressed from there, and I'm sure fantasy won't be enough for these people either. So when they get to the stage where they want to consume media, I prefer it to be AI-created images or some drawn hentai of a naked young girl or whatever, and not real abused children.

What do you think those AI models are trained on?

Not child porn. AI produces images all the time of things that aren't in its training set. That's kind of the point of it.

AI produces images all the time of things that aren't in its training set.

AI models learn statistical connections from the data they're provided. They're going to see connections we can't, but they're not going to create things that are not connected to their training data. The closer the connection, the better the result.

It's a pretty easy conclusion from that that CSAM will be used to train such models, and since training requires lots of data, and new data to create different and better models...

Real material is being used to train some models, but suggesting that this will encourage the creation of more "data" is silly. The amount required to finetune a model is tiny compared to the amount that is already known to exist. Just like how regular models haven't driven people to create even more data to train on.

Just like how regular models haven't driven people to create even more data to train on.

It has driven companies to try to get access to more data people generate to train the models on.

Like ChatGPT on copyrighted books, or Google on emails, docs, etc.

And what does that have to do with the production of CSAM? In the example given, the data already existed; they've just been more aggressive about collecting it.

Well, in addition to regular pedos consuming CSAM, now there are additional consumers: people using huge datasets of it to train models.

If there is an increase in demand, the supply will increase as well.

Not necessarily. The same images would be consumed by both groups; there's no need for new data. This is exactly what artists are afraid of: image generation increases supply dramatically without increasing demand. The amount of data required is also pretty negligible, maybe a few thousand images.


Because eventually looking at images might not be enough

Bro I've watched a lot of regular porn and never once have I gone out and thought "why yes I'd sure like to rape that person"

But you will at least have an outlet if you get yourself a partner or hire an escort. There's the prospect of sex in real life. You're not forever limited to porn.

I haven't had sex in years, yet luckily nobody thinks I am a danger to women. It is almost as if people do not suddenly feel the need to rape someone just because they don't have sex.

Slippery slope arguments almost exclusively come from the only people they seem to affect. You see the same worrying mentality from religious people who tell you that without God they would be committing serious crimes. Most people have an inherent morality that these people seem to lack without strict legal or religious guidelines.

I mean, that makes sense, I guess. I hate these "arguments" because they kill the debate. On the other hand, I am totally not used to seeing any debate on this topic without it derailing into people calling each other pedos. So props to most people in here.

By that logic almost everyone in Hollywood should be in prison for depicting violence, murder, rape, etc. in movies and shows. This argument was put to rest back in the '90s.


That'd be like giving an alcoholic a pint at the end of the week as a reward for the very alcoholic behavior they want out of.

That'd be like giving money to a gambling addict as they promise to 'pay you back' for the loan you've given them.

My point is, enabling people's worst habits is always a bad idea.

And how can you guarantee for certain that after a while of this AI-generated CP crap, they wouldn't eventually want the real thing down the road and therefore attempt crimes?

Your solution is just dumb altogether.

...Aren't drug patches already a thing for more extreme drugs? I feel like you just gave bad examples when there are actual examples that exist...

I'm not an expert on the psychology of pedophilia, but I don't think it has anything to do with addiction. It seems to be a paraphilia/disorder.

https://en.wikipedia.org/wiki/Pedophilia

I don't claim to be an expert either, but it's kind of a no-brainer to see what addiction is and what it does to people. Really simple stuff.

There is literally no data to back up your slippery slope argument.

You really like spamming that "slippery slope" term, don't you? It's like your ultimate go-to for feeling superior. Just wait until one of these days you use it in a context where it doesn't fit and you look like a dumbass.

If I use such an "argument" and someone calls me out on it, I hope I take the critique to heart and think of an actual argument. Everyone looks like a dumbass from time to time, and so will I.

No, it's like flooding the rhino horn market with fake rhino horn. Literally.

While I don’t disagree with the initial premise, image AI requires training images.

I suppose technically you could use hyper-realistic CGI CSAM, and then it could potentially be a “victimless” crime. But the chances of an AI being trained solely on CGI are basically non-existent. Photorealistic CGI is tough and takes a lot of time and skill to create from scratch. There are people whose entire careers are built upon photorealism, and their services aren’t cheap. And you’d probably need a team of artists (not just one artist, because the AI will inevitably end up learning whatever their “style” is and nothing more) who are both capable and willing to create said images. The chances of all of those pieces falling into place are damned near 0.

Maybe you could supplement the CGI with young-looking pornstar material? There are plenty of pornstars whose entire schtick is looking young. But they definitely don’t look like children, because the proportions are obviously all wrong; children have larger heads compared to their bodies, for example. That’s not something an adult actress can emulate simply by being flat-chested. So these supplemental images could just as easily end up polluting (for lack of a better word) your AI’s training, because it would just learn to spit out images of flat-chested adult women.

Generative AI is perfectly capable of combining concepts. Teach it how to do photorealistic minors and photorealistic porn, and it can combine them to make CSAM without ever being trained on actual CSAM.

Just stop being attracted to kids you sicko

This is like telling someone to "stop liking rock music" or "stop enjoying ice cream." People don't decide what their preferences are, they just have them. If we can give pedophiles a way to release those urges without harming children that should be a good thing. Well not good, but positive in the relative sense at least.

Sounds exactly like MAP acceptance rhetoric to me.

That's because you're not very bright.

You should accept everyone for who they are!

Who they are: a fkin kiddy diddler

Isn't it less accepting and more realistically doing damage control to avoid legitimate damage?

Like, sure, we can shun them to the point of violence, but then they'll just hide it, bottle it up, and we just have to hope they never actually do anything with those bottled-up emotions.

We're very quick to get uncomfortable about people's issues, but very slow to do anything to actually prevent those issues from leading to more serious issues.

The reason we're even going down this line of reasoning is the possibility that it could lead to no actual children being harmed. And it's mainly because it has the advantage of deterring people you don't actually know have those issues, which is the #1 issue with it at the moment.

Incels already have an outlet (regular porn), and they still join far right terrorist groups because they can't pull girls in real life. Imagine incels but they also molest kids.

It's idiots like this that make me think of that story I saw the other day on Lemmy, where a disabled man was taking pictures of kids damaging his property; someone saw him and called the cops. The cops came, questioned him, found out what was up, and released him. Meanwhile, morons in the neighborhood heard he'd been taking pictures of kids and got arrested, and that was enough for them to brutally beat and kill him that same night.

But do you actually have a reply to the content of the comment? You're essentially pulling a "smells like communism, it bad" instead of addressing the points brought up.

That's like telling gay dudes to stop liking dick. It's brain chemistry and neural circuits, you can't exactly just snap your fingers and be rid of the problem. Humans are complex creatures.

Obligatory "are you comparing pedophiles to LGBTQ?"

Yes, obviously. I'm not equating them, but human sexuality is a deep and multifaceted subject. We can't just make statements like "just stop enjoying this", that's not how humans work and I think we all know that.


Pedophilia is not akin to being gay (and kindly fuck off with that tyvm). It's akin to rape, or sexual sadism (and I mean real, violent sadism, not roleplay). It is a predatory inclination and @nxsfi is right - trying to frame it as an "orientation" does sound like MAP acceptance rhetoric.

Pedophilia can't be comparable to rape because rape is an act and pedophilia is not. Child rape is often incorrectly called pedophilia, but pedophilia itself is a mental state, and this is exactly the kind of conversation where conflating mental states with actions is entirely unacceptable.

I like girls. Does that mean I rape them too? Spaceships too. Also rape?

Children can never consent, so your analogy doesn't work

Spaceships can absolutely consent.

Does lemmy have a dragonsfuckingcars yet?

Dunno, but would it be cool with ai spaceships fucking kids?

Consent to what? Me liking them? Nobody really gets to decide what I like. I don't even decide that. If I could do that, I'd pick something super accessible and non controversial like just breathing or viewing the color green.


"Anyone who disagrees with me is a child rapist." That's the level of argumentation I expect from a child or a fascist.

What's more disgusting, nonces or those who say that it's fascist to ban them?

Stop being gay, stop being trans, stop being attracted to fatties, stop being attracted to small people.

Stop trying to make LGBTP a thing

The point is, do folks wake up and select their sexual interest?

Do you believe that lgbt conversion therapy works? Because when you say "just stop being attracted to kids, you sicko" you're essentially saying that you can train someone out of their base sexual attractions. I don't think lgbt conversion therapy works, do you?

Pedophiles have no place in LGBT+. Those who say otherwise are conservatives trying to undermine LGBT+ by associating them with pedophiles.

This is not about LGBTQIA+! To strengthen the argument, one could also say "stop being hetero" or "stop being attracted to your preference". It's just not possible, the same as it's not possible for pedophiles to change their preference.

The crucial difference from most other forms of sexuality is that they can't engage in sexual acts with consenting people, which makes acting on it morally and legally wrong. But just being a pedophile isn't, if they don't act upon it.

And no one is calling for pedophilia to get a place in LGBTQIA+, btw... Don't know where you got that from, smh...

Stop parroting your lines and actually respond to the content of the comment ffs. You're acting the same as a conservative who ignores what folks say and yells abortion is murder until they're blue in the face. If you want to be that guy, you do you, I just won't bother wasting my time.
