AI-generated child sexual abuse images could flood the internet. A watchdog is calling for action

robocall@lemmy.world to Technology@lemmy.world – 80 points –
apnews.com

Deepfakes of an actual child should be considered defamatory use of a person's image; but they aren't evidence of actual abuse the way real CSAM is.

Remember, the original point of the term "child sexual abuse material" was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse -- such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)

But if a picture does not depict a crime of abuse, and does not depict a real person, it is basically an illustration, same as if it was drawn with a pencil.

> Remember, the original point of the term "child sexual abuse material" was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse -- such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Although that distinction lasted about a week before the same bad actors who cancel people over incest fanfic started calling all of the latter CSEM too.

As a sometime fanfic writer, I do notice when fascists attack sites like AO3 under a pretense of "protecting children", yes.

And it's usually fascists, or at least people who may not consider themselves as such but who think and act like fascists anyway.

Here's an extra twist: hopefully, if the sickos are at least satisfied with the AI stuff, they won't go after the "real" thing.

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do it.

> Sadly, a lot of it does evolve from wanting to "watch" to wanting to do it.

This is the part where I disagree, and I would love for people to prove me wrong. Whether this claim is true or false will probably be the deciding factor in allowing or restricting "artificial CSAM".

> Sadly, a lot of it does evolve from wanting to "watch" to wanting to do it.

Do you have a source for this?

Some actually fetishize causing suffering.

Some people are sadists and rapists, yes, regardless of what age group they'd want to do it with.