South Korea has jailed a man for using AI to create sexual images of children in a first for country's courts

alphacyberranger@sh.itjust.works to World News@lemmy.world – 547 points –
South Korea has jailed a man for using AI to create sexual images of children in a first for country's courts | CNN
edition.cnn.com


If the man did not distribute the pictures, how did the government find out? Did a cloud service rat him out? Or spyware?

My guess would be that he wasn't self-hosting the AI model, so his requests were going through a website.

The service should have NSFW detection and ban users instantly if it detects that kind of content.

ChatGPT can be tricked into giving IED instructions if you ask the right way. So it could be a similar situation.

Why should it have that? Stable Diffusion websites know that most of their users are interested in NSFW content. I think the idea is to turn GPUs into cash flow, not to make sure that it is all wholesome.

I suppose they could get some kind of sex+children detector going for all generated images, but you're going to have to train that model on something, so now it's a chicken-and-egg problem.

He was found extorting little girls with nude pics he generated of them.

Edit: So I guess he just generated them. In that case, how did they become public? I guess this is the problem with not reading the article.

Earlier this month, police in Spain launched an investigation after images of underage girls were altered with AI to remove their clothing and sent around town. In one case, a boy had tried to extort one of the girls using a manipulated image of her naked, the girl’s mother told the television channel Canal Extremadura.

That was another case in Spain. Not the guy in Korea. The person in Korea didn’t distribute the images.

I really gotta wonder what the difference is between prosecuting someone for their thoughts and prosecuting them for jerking it to artwork or generated images they kept entirely to themselves. The only wrong I see here is someone having their privacy invaded by someone bigger than them and being put on display for it. Sound familiar?

Why the fuck isn't that the headline? Jesus, that's really awful and changes everything.

Because that was another case. Extortion and blackmail (and in that case it would also count as production of CP, just as it would if you drew a real child) are already illegal. In this case we simply don't have enough information.