Man Arrested for Creating Child Porn Using AI

db0@lemmy.dbzer0.com to News@lemmy.world – 362 points –
futurism.com

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, producing dramatic footage of law enforcement leading McCorkle, still in his work uniform, out of the theater in handcuffs.



You would think so, but you're basically making a patchwork version of the actual illicit media, so it's a very dark gray area for sure.

Hmm ok. I don't know much about AI.

Generative AI is basically just really overpowered text/image prediction. It fills in the words or pixels that make the most sense based on the data it has been fed. So to get AI-generated CSAM, it either had to have been fed some amount of CSAM at some point, or it had to be heavily manipulated into generating the images in question.

so to get AI-generated CSAM, it had to have been fed some amount of CSAM

No, actually: it can combine concepts that aren't present together in its dataset. Does it know what a child looks like? Does it know what porn looks like? Then it can generate child porn without ever having had CSAM in its dataset. See the corn dog comment for an argument.

Edit: corn dog

Some of the image generators have attempted to put up guardrails to prevent generating pictures of nude children, but the creators/managers haven't been able to eliminate the problem. There was also an investigation by Stanford University that showed that most of the really popular image generators had a not insignificant amount of CSAM in their training data and could be fairly easily manipulated into making more.

The creators and managers of these generative "AIs" have done slim to nothing in the way of curation, and have routinely tried to fob responsibility off onto their users, the same way Tesla has been doing with its "full self-driving".

A dumb argument. Corn and dog were both in the dataset, but what it generated isn't a corn dog like what we expect when we think of a corn dog.

Hence it can't grasp what we know a corn dog to be.

You have proved the point for us since it didn't generate a corn dog.

Ok, makes sense. Yuck, my skin crawls. I got exposed to CSAM via Twitter years ago. Thankfully it was just a shot of nude children I saw and not the actual deed, but I was haunted.