FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

misk@sopuli.xyz to Technology@lemmy.world – 505 points –
404media.co


The AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well... you need to feed it CSAM.

First of all, not every image of a naked child is CSAM. This has actually been something of a problem, with automated CSAM-detection systems triggering false positives on non-sexual images and getting innocent people into trouble.

But also, AI systems can blend multiple elements together. They don't need CSAM training material to create CSAM, just the individual elements crafted into a prompt sufficient to create the image while avoiding any safeguards.

You ignored the second part of their post. Even if it didn't use any CSAM, is it right to use pictures of real children to generate CSAM? I really don't think it is.

There are probably safeguards in place to prevent the creation of CSAM, just as there are for other illegal and offensive material, but determined people work around them.