Bebo@literature.cafe to Technology@lemmy.world – 231 points –
Paedophiles create nude AI images of children to extort from them, says charity
theguardian.com

The Internet Watch Foundation (IWF) has found a manual on the dark web encouraging criminals to use software tools that remove clothing from photos. The manipulated image could then be used to blackmail the child into sending more graphic content, the IWF said.

This is an example of why every responsible parent should forbid their children from uploading any pictures of themselves online, or better yet, bar them from social media entirely. This might be a hot take here, but parents should install monitoring software on all of their children's devices and be open about it. Not doing so is negligent.

Your kids could end up on the pedo registry if they take a picture of themselves and someone changes it into porn.

We could deal with this easily by banning the distribution of porn entirely.

Easily? This would cause riots.

The average coomer would rage out in their basement and nothing would happen.

If people are that addicted to porn, that's all the more proof it needs to be taken away.
