Man Arrested for Creating Child Porn Using AI

db0@lemmy.dbzer0.com to News@lemmy.world – 362 points
Man Arrested for Creating Child Porn Using AI
futurism.com

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage: law enforcement led McCorkle, still in his work uniform, away from the theater in handcuffs.

This creates a significant legal issue: AI-generated images have no age, and there is no one who could give consent.

The difference in appearance between age 16 and 18 is minimal, but the legal difference is immense, and it rests entirely on a concept (the subject's real age) that cannot apply to a generated image.

How do you define what depicts a fictional child, especially without sweeping in real adults? I've met people who believe that preferring a shaved pubic area is pedophilia, even though the vast majority of adult women shave. On the flip side, teenagers from the 70s and 80s would be mistaken for 40+ today.

Even the extremes aren't clear. The adult star "Little Lupe," who was 18+ in every single appearance, lacked most secondary sex characteristics. Experts testified in court that she could not possibly be an adult. Except she was, and there's full documentation to prove it. Would an AI trained exclusively on her work be producing CSAM?

To paraphrase someone smarter than me, "I'll know it when I see it."

But naturally I don't want to see it. One of the things I miss least about Reddit is the constant image posts of anime characters, who may be whatever age they claim but are clearly drawn as very young girls with big tiddies bolted on. It's gross, but it's also a problem that's more widespread and nebulous than most people are willing to admit.

"I'll know it when I see it."

I can't think of anything scarier than that when dealing with the legality of anything.

I'm nearly 40 and still regularly get carded while other people out with me do not, so it's not just "we card everyone." People are bad at judging age.

Are you in an age gap relationship? I find that my ID gets checked more if I'm in any kind of age gap, I assume due to curiosity. Do you get carded the same amount if you are alone?

Nope, I'm not in a relationship at all. This is while out with friends, but I'm usually the first one there, or with one other dude who is about the same age as me, so nothing to speak of there. It happens regardless of whether I'm alone or not. I got carded at the grocery store checkout last week and the cashier seemed genuinely shocked at my age.

Sometimes something can't have a perfect definition. What's the difference between a gulf, a bay, and a channel? Where does the shoreline become a beach? When does an arid prairie become a desert? How big does a town have to grow before it becomes a city? At what point does a cult become a religion? When does a murder become premeditated vs. a crime of passion? When does a person become too drunk to give active consent? Human behavior is a million shades of gray, just like everything else we do, and the things that don't fit our clear definitions are where the law needs to be subjective.

The first batch of those doesn't have anything to do with what can get you arrested.

Beyond that, a crime of passion is legally defined as:

In criminal law, a crime of passion is a crime committed in the "heat of passion" or in response to provocation, as opposed to a crime that was premeditated or deliberated.

Premeditated murder though is:

Premeditation is when an individual contemplates, for any length of time, the undertaking of an activity and then subsequently takes the action.

As for too drunk to consent, you can read "Victim Intoxication and Capacity to Consent in Sexual Assault Statutes across the United States"

The result is the same: when you are dealing with a legal matter, you need real definitions, or you have laws that are too vague and ripe for abuse. It becomes the legal roulette of "is the cup half full or half empty," with someone's life and/or future in the balance.

Even just on guessing someone's age (we'll assume completely family-friendly and above board), think back to high school. How old did you and your peers look? Now go take a look at high schoolers today. They probably seem a lot younger than you did. The longer it's been (i.e., the older you are), the younger they look. Which means "when I see it" depends entirely on the age of the viewer.

This isn't even just about perception and memory: modern style is based on and heavily influenced by youth, and it's continuing to move in that direction. This is why actors in their 30s, with carefully managed hair, skin, makeup, and wardrobe, have been able to convincingly portray high schoolers. So it's not just you: teens really do look younger each year. But they're still the same age.

Wtf. Style is what makes kids look young or old to us, because we've been heavily marketed to and follow trends. That's why when the mullet/porn-stache style came back, those Dahmer kids looked like they were in their 40s.

You're getting older each year so teens look younger to you.

Name even one actor in their thirties who convincingly played a high schooler. Literally who?

I don't see how children were abused in this case. It's just AI imagery.

It's the same as saying that people get killed when you play first person shooter games.

Or that you commit crimes when you play GTA.

Not a great comparison, because unlike with violent games or movies, you can't say there is no danger to anyone in allowing these images to be created or distributed. If they are indistinguishable from the real thing, it becomes impossible to identify actual human victims.

There's also a strong argument that the availability of imagery like this only encourages behavioral escalation in people who suffer from the affliction of being a sick fucking pervert pedophile. It's not methadone for them, as some would argue. It's just fueling their addiction, not replacing it.

The difference is intent. When you're playing a FPS, the intent is to play a game. When you play GTA the intent is to play a game.

The intent with AI generated CSAM is to watch kids being abused.

Who's to say there aren't people playing games to watch people die?

There may well be the odd weirdo playing Call of Duty to watch people die.

But everyone who watches CSAM is watching it to watch kids being abused.

When you're playing a FPS, the intent is to watch people being murdered.

How is this argument any different?

Punishing people for intending to do something is punishing them for thought crimes. That is not the world I want to live in.

This guy did do something - he either created or accessed AI generated CSAM.

I'm not talking about "this guy". I'm talking about what you just said.

Intent is defined as intention or purpose. So I'll rephrase for you: the purpose of playing a FPS is to play a game. The purpose of playing GTA is to play a game.

The purpose of AI generated CSAM is to watch children being abused.

I don't think that's fair. It could just as well be said that the purpose of violent games is to simulate real life violence.

Even if I grant you that the purpose of viewing CSAM is to see child abuse, it's still less bad than actually abusing them, just as playing violent games is less bad than participating in real violence. Also, despite the massive increase in violent games and movies, actual violence is going down, so implying that viewing such content would increase the cases of child abuse is an assumption I'm not willing to make either.

The purpose of a game is to play a game through a series of objectives and challenges.

Even if I grant you that the purpose of viewing CSAM is to see child abuse

Very curious to hear what else you think the purpose of watching CSAM might be.

it’s still less bad than actually abusing them

"less bad" is relative. A bad thing is still bad. If we go by length of sentencing then rape is 'less bad' than murder. that doesn't make it 'not bad'.

so implying that viewing such content would increase the cases of child abuse is an assumption I’m not willing to make either.

OK?

I didn't claim that AI CSAM increased anything at all. Literally all I've said is that the purpose of AI generated CSAM is to watch kids being abused.

Neither did I claim that violent games lead to violence. You invented that strawman all by yourself.

A person said that there is no victim in creating simulated CSAM with AI, just like there isn't one in video games, to which you replied that the difference is intention: the intention when playing violent games is to play games, whereas with viewing CSAM the intention is to view abuse material.

Correct so far?

Of course the intent is that. For what other reason would anyone want to see CSAM than to see CSAM? What kind of argument or conclusion is this supposed to be? How else am I supposed to interpret this than as you advocating for the criminalization of creating such content, despite the fact that no one is being harmed? How is that not pre-emptively punishing people for crimes they've yet to even commit? Nobody chooses to be born with such thoughts or desires, so I don't see the point of punishing anyone for that alone.

I've literally got no idea what you're talking about or what your point is. Are you saying this person hasn't committed a crime? Because that's incorrect. Lots of jurisdictions have laws against things like AI-generated CSAM imagery, deepfake porn, and a whole raft of other things. "Harm" doesn't begin and end with something done to an individual for a lot of crimes.

Are you saying this person hasn’t committed a crime?

Yes, and if the law is interpreted in a way that makes it illegal, and the person is punished for it, then that's a moral injustice and the kind of senselessness we as humans should grow out of. The fact that this "crime" has no victim is the whole point of why punishing for it makes no sense.

CSAM is illegal for a very good reason; producing it without abusing children is by definition impossible. By searching for and viewing such content, the person becomes part of the causal chain that leads to it being produced in the first place. By criminalizing it we attempt to deter people from looking for it and thus bringing down the demand and disincentivizing the production of it.

Using AI that is not trained on such content is outside of this loop. There is literally nobody being harmed if someone uses it to create depictions of such content. It's not actual CSAM it's producing; by the very definition it cannot be, any more than shooting a person in a video game is a murder. CSAM stands for Child Sexual Abuse Material (I hate even saying that), in other words, proof of the crime having happened. AI-generated images are fiction. Nobody is being harmed. It's just a more photorealistic version of a drawing. Treating it as actual CSAM in court is insanity.

Now, if the AI has been trained on actual CSAM, and especially if the output simulates real people, then that's a whole other discussion to be had. That is, however, not what we're talking about here.

It's just AI imagery.

Fantasising about sexual contact with children indicates that this person might groom children for real, because they have a sexual interest in doing so. As someone who was sexually assaulted as a child, I can tell you that's really not something that needs to happen.

indicates that this person might groom children for real

But unless they have already done it, that's not a crime. People are prosecuted for actions they commit, not their thoughts.

I agree, this line of thinking quickly spirals into Minority Report territory.

It will always be a gray area, and should be, but there are practical and pragmatic reasons to ban this imagery no matter its source.

Seems like fantasizing about shooting people or carjacking would then indicate that a person might do that activity for real too. There are a lot of carjackings nowadays, and you know GTA is real popular, mmmm. /s But seriously, I'm not sure your first statement has merit, especially when you look at where to draw the line: anime, manga, oil paintings, books, thoughts in one's head.

If you're asking whether anime, manga, oil paintings, and books glorifying the sexualization of children should also be banned, well, yes.

This is not comparable to glorifying violence, because real children are victimized in order to create some of these images, and the fact that it's impossible to tell makes it even more imperative that all such imagery is banned, because the existence of fakes makes it even harder to identify real victims.

It's like you know there's an armed bomb on a street, but somebody else filled the street with fake bombs, because they get off on it or whatever. Maybe you'd say making fake bombs shouldn't be illegal because they can't harm anyone. But now suddenly they have made the job of law enforcement exponentially more difficult.

Sucks to be law enforcement then. I'm not giving up my rights to make their jobs easier. I hate hate HATE the trend towards loss of privacy and the "if you didn't do anything wrong then you have nothing to hide" mindset. Fuck that.

If you want to keep people who fantasise about sexually exploiting children around your family, be my guest. My family tried that, and I was raped. I didn't like that, and I have drawn my own conclusions.

Yeah, and the same goes if you want to keep around people who fantasize about murdering folk. You can't say one thing counts without saying the other does. I'm sorry you were raped, but I doubt it would have been stopped by banning Lolita.

I don't recall Nabokov's novel Lolita saying that sexualising minors was an acceptable act.

Thanks for the strawman, though, I'll save it to burn in the colder months.

You can call it a strawman, but whether the evil in question is killing folks or raping folks, the reasoning should be the same when discussing the non-actual versus the actual. You can say this thing is a special case, but when it comes to freedom of speech, which covers anything not based in actual events (writing, speaking, thinking, art), special circumstances become a real slippery slope (which can itself be called a fallacy, though like all "fallacies" that depends a lot on what backs it up and how it's presented).

Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn/rendered images, we don't know that, and even if the output were in that style, it might very well be photorealistic images generated from real child porn and run through a filter.

An AI that is trained on children and nude adults can infer what a nude child looks like without ever being trained specifically with those images.

Your argument is hypothetical. Real-world AI was trained on images of abused children.

https://cyber.fsi.stanford.edu/news/investigation-finds-ai-image-generation-models-trained-child-abuse

Only because real world AI was trained on the dataset of ALL PUBLIC IMAGES, dumbass

So you're admitting they are correct?

No, I'm admitting they're stupid for even bringing it up.

Unless their argument is that all AI should be illegal, in which case they're stupid in a different way.

Do you think regular child porn should be illegal? If so, why?

Generally it's because kids were harmed in the making of those images. Since we know that AI is using images of children being harmed to make these images, as other posters have repeatedly sourced (but also, if you've looked up deepfakes, most deepfakes are an existing porn video with the face changed over top; they do this with CP as well, and must use CP videos to seed it, because the adult model would be too large)... why does AI get a pass for using children's bodies in this way? Why isn't it immoral when AI is used as a middleman to abuse kids?

Since we know that AI is using images of children being harmed to make these images

As I keep saying, if this is your reasoning then all AI should be illegal. It only has CP in its training set incidentally, because the entire dataset of images on the internet contains some CP. It's not being specifically trained on CP images.

You failed to answer my questions in my previous comment.

Ok, if you insist... yes, CP should be illegal, since a child was harmed in its making. It can get a bit nuanced (for example, I don't like that it can be illegal for underage people to take pictures of their own bodies), but that's the gist of it.

That's not all of the questions I asked

They do this with CP as well, and must use CP videos to seed it, because the adult model would be too large)... why does AI get a pass for using children's bodies in this way? Why isn't it immoral when AI is used as a middleman to abuse kids?

Yes, exactly. The people excusing this with "well, it was trained on all public images" are just admitting you're right, and that there is a level of harm here, since real materials are used. Even if they weren't being used, or if it was just a cartoon, the morality is still shaky because of the role porn plays in advertising. We already have laws about advertising because it's so effective, including around cigarettes and prescriptions. Most porn, ESPECIALLY FREE PORN, is an ad to get you to buy other services. CP is not excluded from this rule; no one gets a free lunch, so to speak. These materials are made and hosted for a reason.

The role that CP plays in most countries is an ugly one. It is used for blackmail. It is also used to generate money for states (intelligence groups around the world host illegal porn ostensibly "to catch a predator," but then why is it morally okay for them to distribute these images but no one else?). And it's used as advertising for actual human trafficking organizations. Similar organizations exist for snuff and gore, btw. And of course animals. And any combination of those three. Or did you all forget about those monkey torture videos, or the orangutan who was being sex trafficked? Or Daisy's Destruction and Peter Scully?

So it's important not to allow these advertisers to combine their most famous monkey torture video with enough AI that they can say it's AI-generated, when it's really just an ad for their monkey torture productions. And even if NONE of the footage was from illegal or similar events and was 100% thought up by AI, it can still be used as an ad for these groups if they host it. Cartoons can be ads, of course.

How many corn dogs do you think were in the training data?

Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch corn dogs. When we caught one, we'd roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is a popular meal, especially among the less fortunate, even though some of the affluent consider it the equivalent of eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.

It didn't generate what we expect and know a corn dog to be.

Hence it missed, because it doesn't know what a "corn dog" is.

You have proven the point that it couldn't generate CSAM without some being present in the training data.

I hope you didn't seriously think the prompt for that image was "corn dog" because if your understanding of generative AI is on that level you probably should refrain from commenting on it.

Prompt: Photograph of a hybrid creature that is a cross between corn and a dog

Then if your question is "how many photographs of 'a hybrid creature that is a cross between corn and a dog' were in the training data?"

I'd honestly say: I don't know.

And if you're honest, you'll say the same.

But you do know, because corn dogs as depicted in the picture do not exist, so there couldn't have been photos of them in the training data, yet it was still able to create one when asked.

This is because it doesn't need to have seen one before. It knows what corn looks like and it knows what a dog looks like, so when you ask it to combine the two, it will gladly do so.

But you do know, because corn dogs as depicted in the picture do not exist, so there couldn't have been photos of them in the training data, yet it was still able to create one when asked.

Yeah, except Photoshop and artists exist. And a quick Google image search will find them. 🙄

And this proves that AI can't generate simulated CSAM without first having seen actual CSAM how, exactly?

To me, the takeaway here is that you can take a shitty two-minute Photoshop doodle and, by feeding it through AI, improve its quality by orders of magnitude.

I wasn't the one attempting to prove that. Though I think it's definitive.

You were attempting to prove it could generate things not in its dataset, and I have disproved your theory.

To me, the takeaway here is that you can take a shitty two-minute Photoshop doodle and, by feeding it through AI, improve its quality by orders of magnitude.

To me, the takeaway is that you know less about AI than you claim. Much less. Because we have actual instances, and many of them, where CSAM is in the training data. Don't believe me?

Here's a link to it

You were attempting to prove it could generate things not in its dataset, and I have disproved your theory.

I don't understand how you could possibly imagine that pic somehow proves your claim. You've made no effort to explain yourself; you just keep dodging my questions when I ask you to. A shitty Photoshop of a "corn dog" has nothing to do with how the image I posted was created. It's a composite of corn and a dog.

Generative AI, just like a human, doesn't rely on having seen an exact example of every possible image or concept. During its training, it was exposed to huge amounts of data, learning patterns, styles, and the relationships between them. When asked to generate something new, it draws on this learned knowledge to create a new image that fits the request, even if that exact combination wasn't in its training data.
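
To make the "combining concepts" point concrete, here's a minimal sketch of how an image like the corn-dog one gets produced, assuming the Hugging Face diffusers library and a public Stable Diffusion checkpoint (the model name here is just an example, not a claim about which model was actually used):

```python
# Minimal sketch, assuming the `diffusers` library and a public
# Stable Diffusion checkpoint. The model has seen corn and dogs
# separately during training; the hybrid comes from composing those
# learned concepts, not from any photo of the hybrid itself.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    "Photograph of a hybrid creature that is a cross between corn and a dog"
).images[0]
image.save("corn_dog.png")
```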

Because we have actual instances, and many of them, where CSAM is in the training data.

Now, if the AI has been trained on actual CSAM, and especially if the output simulates real people, then that's a whole other discussion to be had. That is, however, not what we're talking about here.

Generative AI, just like a human, doesn't rely on having seen an exact example of every possible image or concept

If a human has never seen a dog before, they don't know what it is or what it looks like.

If it's the same as a human, it won't be able to draw one.

we don't know that

might

Unless you're operating under "guilty until proven innocent", those are not reasons to accuse someone.

How was the model trained? Probably using existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate its exploitation of children all the way down.

So no, you are making a false equivalence with your video game metaphors.

A generative AI model doesn't require the exact thing it creates to be in its dataset. It most likely just combined regular nudity with pictures of children.

In that case, the images of children were still used without their permission to create the child porn in question

That's not really a nuanced take on what is going on. A bunch of images of children are studied so that the AI can learn how to draw children in general. The more children in the dataset, the less any one of them influences or resembles the output.
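
As a toy numerical analogy (my own illustration, not how diffusion training actually works): average enough inputs together and the aggregate stops resembling any single one of them.

```python
# Toy analogy only: the mean of many random "images" barely
# correlates with any individual one of them.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(size=(1_000, 4_096))  # stand-ins for training images

aggregate = samples.mean(axis=0)

# Prints a correlation near zero.
print(np.corrcoef(aggregate, samples[0])[0, 1])
```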

Ironically, you might have to train an AI specifically on CSAM in order for it to identify the kinds of images it should not produce.

Why does it need to be "nuanced" to be valid or correct?

Because the world we live in is complex, and rejecting complexity for a simple view of the world is dangerous.

See You Can’t Get Snakes from Chicken Eggs from the Alt-Right Playbook.

(Note I’m not accusing you of being alt-right. I’m saying we cannot ignore nuance in the world because the world is nuanced.)

That's a whole other thing from the AI model being trained on CSAM. I'm currently neutral on this topic, so I'd recommend replying to the main thread.

How is it different?

It's not CSAM in the training dataset, just pictures of children/people that are already publicly available. That goes to the copyright side of AI, not to illegal training material.

It's images of children used to make CSAM. No amount of mental gymnastics can change that, nor the fact that those children's consent was not obtained.

Why are you trying so hard to rationalize the creation of CSAM? Do you actually believe there is a context in which CSAM is OK? Are you that sick and perverted?

Because it really sounds like that’s what you’re trying to say, using copyright law as an excuse.

It's the same every time with you people: you can't have a discussion without accusing someone of being a pedo. If that's your go-to, it says a lot about how weak your argument is, or what your motivations are.

It’s hard to believe someone is not a pedo when they advocate so strongly for child porn

You're just projecting your unwillingness to ever take a stance that doesn't personally benefit you.

Some people can think about things objectively and draw a conclusion that makes sense to them without personal benefit being a primary determinant of said conclusion.

You're just projecting your unwillingness to ever take a stance that doesn't personally benefit you.

I’m not the one here defending child porn

It's hard to argue with someone who believes the use of legal data to create more data could ever be illegal.

Lol, you don't understand that the faces the AI generates are not real. In any way.

I am not trying to rationalize it, I literally just said I was neutral.

How are you neutral about child porn? The vast majority of everyone on this planet is very much against it.

I'm not neutral about child porn; I'm very much against it. Stop trying to put words in my mouth. I'm saying this kind of use of AI could be in the very same category as loli imagery, since it is not real child sexual abuse material.

I'm not neutral about child porn

Then why are you defending it?

Good luck convincing the AI advocates of this. They have already decided that all imagery everywhere is theirs to use however they like.

Can you or anyone verify that the model was trained on CSAM?

Besides, a generative model doesn't need explicit content to derive from in order to create a naked child.

You're defending the generation of CSAM pretty hard here, with some vague "but no child we know of was involved" as a defense.

I just hope the models aren't trained on CSAM, which would make generating stuff they can fap to ""ethically reasonable,"" since no children would be involved. And I hope that those who have those tendencies can be helped one way or another, in a way that doesn't involve chemical castration or incarceration.

While I wouldn't put it past Meta & Co. to explicitly seek out CSAM to train their models on, I don't think that's how this stuff works.

But the AI companies insist the outputs of these models aren't derivative works in any other circumstances!

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go "ok, itch scratched", and tank the demand for the real stuff.

Depending on which way it goes, it could be massively helpful for protecting kids. I just don't have a sense for what the effect would be, and I've never seen any experts weigh in.

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

From bits/articles I've seen here and there over the years about other things that are kind of in the same category (porn comics with child characters in them, child-shaped sex dolls), the latter seems to be more the case.

I'm reminded of when people were arguing that when Internet porn became widespread, the incidence of rape would go through the roof. And then literally the opposite happened. So...that pushes me toward hypothesizing that the latter is more likely to be the case, as well.

In Australia cartoon child porn is enforced in the same way as actual child porn. Not that it answers your question but it's interesting.

I'd imagine for your question "it depends": some people who would have acted on their urges may get their jollies from AI child porn; others who had never considered being pedophiles might find the AI child porn (assuming it's legal) and realize it's something they're into.

I guess it may lower the production of real child porn, which feels like a good thing. I'd hazard a guess that there are way more child porn viewers than child abusers.

In Australia a 30 year old woman cannot be in the porn industry if she has small breasts. That, and the cartoon ban both seem like overcompensating.

Nothing says "we're protecting children" like regulating what adult women can do with their bodies.

Conservatives are morons, every time.

They're not morons.

Any time anyone ever says they want to do anything "to protect the children" you should assume it's about control. No one actually gives a shit about children.

I seem to remember Sweden did a study on this, but I don't really want to google around to find it for you. Good luck!

I'd like to know what psychologists think about it. My assumption is the former, it escalates their fantasizing about it and makes them more likely to attack a child.

There seems to be no way to conduct that experiment ethically, though.

Real question: "do we care if AI child porn is bad?" Based on most countries' laws, no.

There's like a lot of layers to it.

  • For some, it might actually work in the opposite direction, especially if paired with the wrong kind of community around it. I used to moderate anime communities, and the number of loli fans wanting to lower the age of consent to 12 or even lower was way too high, while they called the people opposed to loli the "real predators," because they liked their middle-school-tier arguments (which just further polarized the fandom when the culture wars started).
  • Even worse, the more realistic depictions might actually work against that goal, whereas with (most) loli stuff it's at least obvious that it's drawn.
  • An often overlooked issue is data laundering: just call your real CP "AI-generated," or add some GAI artifacts to your collection. Hungary bans overly realistic drawings and paintings of that kind because people did exactly that with traditional means, creating tracings as realistic as possible (calling CP "artistic nudes" didn't work out here, at least).

There definitely is opportunity in controlled treatment. But I believe outside of that there are too many unknowns.

You're missing the point. They don't care what's more or less effective for helping kids. They want to punish people who are different. In this case nobody is really going to step up to defend the guy for obvious reasons. But the motivating concept is the same for conservatives.

There are literally mountains of evidence suggesting that normalizing child abuse in any fashion increases the rate at which children are actually abused, but that never stops there from being a highly upvoted comment suggesting that jacking it to simulated kids is somehow a "release valve" for actual pedophilia, which makes absolutely no fucking sense given everything we know about human sexuality.

If this concept were true, hentai fans would likely be some of the most sexually well-adjusted people around, having tons of experience releasing their real-world sexual desires via a virtual medium. Instead, we find that these people objectify the living shit out of women, because they've adopted an insanely overidealized caricature of what a woman should look and act like that is completely divorced from reality.

Wikipedia seems to suggest the research is inconclusive on whether consuming CSAM increases the likelihood of committing abuse.

Depending on which way it goes, it could be massively helpful for protecting kids

Weeeelll, only until the AI model needs more training material..

You only need more training material to train a new AI. Once the AI is there, it can produce as many pictures as you want. And you can get good results even with models that can be run locally on a regular computer.

I'm not sure if that is how it would work? But this is exactly the kind of thinking we need. Effects: intended plus unintended equals ???

Could this be considered a harm reduction strategy?

Not that I think CSAM is good in any way, but if it saves a child would it be worthwhile? Like if these pedos were to use AI images instead of actual CSAM would that be any better?

I've read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it's such a problem.

Many years ago (about 25) I read an article in a newspaper (idk the name, but it may have been The Computer Paper, which is archived online someplace). The article noted that a study had been commissioned to show that CP access increases child abuse. The study seemed to show the opposite.

Here's the problem with even AI generated cp: It might lower abuse in the beginning, but with increased access it would 'normalise' the perception of such conduct. This would likely increase abuse over time, even involving persons who may not have been so inclined otherwise.

This is all very complex. A solution isn't simple. Shunning things in any way won't help, though, and that seems to be the currently most popular way to deal with the issue.

Actual pedophiles (a lot of CSA is abuse of power, not pedophilia, though to be clear, fuck abusers either way) have a high rate of suicidal ideation because they think it's as fucked up as everyone else does. Of course we can't just say "sure, AI material is legal now," but I could imagine a regulated system, accessed via doctors, akin to how controlled substances work.

People take this firm "kill 'em all" stance, but these people just feel the way they do, the same as I feel toward women or a gay man feels toward men. It just is what it is; we all generally agree being gay isn't a choice, and this is no different. As long as they don't act on it, I think we should be sympathetic and open to helping them live a less tortured life.

I'm not 100% saying this is how we do it, but we should be open to exploring the issue instead of full stop demonization.

Dan Savage coined the term "gold star pedophile" in a column years ago, referring to people who acknowledge their attraction to children but never act on it by harming a child or accessing CSAM. I do feel bad for these people because there are no resources to help them. The only way they can access actual therapeutic resources for their condition is by offending and going to jail. If the pedophile goes to a therapist and confesses attraction to children, therapists are mandated reporters and will assume they're going to act on it. An article I read a few years back interviewed members of an online community of non-offending pedophiles who essentially made their own support group since no one else will help them, and nearly all research on them is from a forensic (criminal) context.

There's a pretty good article by James Cantor talking about dealing with pedophiles in a therapeutic context here.

Don't get me wrong - I think offenders need to be punished for what they do. I unfortunately have a former best friend who has offended. He's no longer in my life and never will be again. But I think we could prevent offenders from reaching that point and hurting someone if we did more research and found ways to stop them before it happened.

We really gotta flip the standard and make therapy sessions 100% confidential. We should encourage people to seek help in stopping their bad behavior, no matter what it is, and they're less likely to do that if they think a therapist could report them.

You're asking therapists to live with that information. It's not so easy to hear that a child is being actively raped and not be legally allowed to report it.

We already lose tons of social workers in CPS because they can't help those kids much or save them. Most normal adults can't really mentally handle child torture without doing something about it. How many unreported child abuse cases before a therapist kills themselves?

Let alone that you're sentencing a child to live in a rape nightmare, something most adults can't tolerate, all so their abuser can maybe get some help. Wonder how many kids will kill themselves. What the actual fuck. Here's a hint: kids are slaves, so passing laws that disempower them even more is really fucked up.

I mean, how many children get abused because people are too afraid to seek help? It's not an area with an easy answer, and I don't have hard numbers on how much harm either scenario would produce.

Well, if all the therapists kill themselves, then that system will be worse than the current one because no one will be getting help

A practically guaranteed scenario, no doubt.

I agree for the most part, particularly that we should be open minded.

Obviously we don't have much reliable data, which I think is critically important.

The only thing I would add is that I'm not sure treating a desire for CSAM would be the same as treating substance abuse. Like, "weaning an addict off CSAM" seems like a strange proposition to me.

Maybe I was unclear; when I relate potential generated material to controlled substances, I mean in relation to how you obtain it.

You go see a psych, probably go through some therapy, and if they feel it would be beneficial, you'd be able to get material via strictly controlled avenues, like how you need a prescription for Xanax and it's a crime to sell or share it.

(And I imagine some sort of stamping, whether in the imagery or in the files, to trace any leaked material back to the person who shared it, but that's a different conversation.)

"Normalized" violent media doesn't seem to have increased the prevalence of real world violence.

I actually think video games reduce crime in general. Bad kids are now indoors getting their thrills.

That makes sense. I don't know what a better answer is, just thinking out loud.

You would think so, but you're basically making a patchwork version of the actual illicit media, so it's a dark, dark gray area for sure.

Hmm ok. I don't know much about AI.

Generative AI is basically just really overpowered text/image prediction. It fills in the words or pixels that make the most sense based on the data it has been fed, so to get AI-generated CSAM... it had to have been fed some amount of CSAM at some point, or it had to be heavily manipulated into generating the images in question.
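
As a deliberately tiny stand-in for the "prediction" part (a toy of my own, nothing like a real model's scale or architecture), the fill-in-the-most-likely-continuation principle looks like this:

```python
# Toy next-word predictor built from bigram counts: it "fills in"
# whatever continuation was most common in the data it was fed.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(prev: str) -> str:
    # Return the continuation seen most often after `prev`.
    return bigrams[prev].most_common(1)[0][0]

print(predict("the"))  # -> "cat" (seen twice after "the", vs. "mat" once)
```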

so to get AI-generated CSAM... it had to have been fed some amount of CSAM

No, actually, it can combine concepts that aren't present together in the dataset. Does it know what a child looks like? Does it know what porn looks like? Then it can generate child porn without ever having had CSAM in its dataset. See the corn dog comment as an argument.

Edit: corn dog

Some of the image generators have attempted to put up guardrails to prevent generating pictures of nude children, but the creators/managers haven't been able to eradicate it. There was also an investigation by Stanford University showing that most of the really popular image generators had a not-insignificant amount of CSAM in their training data and could be fairly easily manipulated into making more.

The creators and managers of these generative "AIs" have done slim to nothing in the way of curation, and have routinely tried to fob off responsibility onto their users, the same way Tesla has been doing with its "full self-driving."

A dumb argument. Corn and dog were in the data, but that's not a corn dog like what we expect when we think "corn dog."

Hence it can't get at what we know a corn dog is.

You have proved the point for us, since it didn't generate a corn dog.

Ok, makes sense. Yuck, my skin crawls. I got exposed to CSAM via Twitter years ago; thankfully it was just a shot of nude children I saw and not the actual deed, but I was haunted.

I guess my question is: does access to regular porn make people not want to have real sex with another person? Does it "scratch the itch," so to speak? Could they go the rest of their life with only porn to satisfy them?

It depends on the person. I feel like most people would be unsatisfied with only porn, but that's just anecdotal.

I honestly think AI-generated CSAM isn't something the world needs produced. It's not contributing to society in any meaningful way, and pedophiles who don't offend or hurt children need therapy, while the ones who do need jail time (and therapy, but I'm in the US, so that's a whole other thing). They don't "need" porn.

My own personal take is that giving pedophiles AI-generated CSAM is like showing alcohol ads to alcoholics, or taking a sex addict to the strip club. It's probably not going to lead to good outcomes.

Think of it this way: what if the government said one day, "All child porn made before this date is legal; all child porn made after this date is illegal"?

You would end up with a huge corpus of "legal" child porn that pedophiles could use as a release, but you could become draconian about the manufacture of new child porn. This would, theoretically, discourage new child porn from being created, because the risk is too high compared to the legal stuff.

Can you see the problem? That's right, in this scenario, child porn is legal. That's fucked up, and we shouldn't do that, even if it is "simulated", because fuck that.

You definitely have a good point. I was just thinking of harm reduction, but obviously I don't want it to be legal.

By the same metric, I wonder why we don't let convicted murderers and psychopaths work at slaughterhouses.

On the other hand, are people who work at slaughterhouses more likely to be murderers and psychopaths?

Hey, remember that terrible thing everyone said would happen?

It's happening.

If this thread (and others like it) has taught me anything, it's that, facts be damned, people are opinionated either way. Nuance means nothing, and it's basically impossible to have a proper discussion when it comes to wedge issues or anything that can be used to divide people. Even if every study said, 100%, that AI-generated CSAM always led to a reduction in actual child harm, reduced recidivism, and never needed real children as training material, the comments would still look pretty much the same. If the studies showed the exact opposite, the comments would also be the same. Welcome to the internet. I hope you brought aspirin.

My man. Go touch some grass. This place is no good. Not trying to insult you but it's for your mental health. These Redditors aren't worth it.

Actually. I needed that. Thanks. Enough internet for me today.

A lot of the places I've tried to start conversations have been hostile and painful. If there's one thing holding Lemmy back, it's the shitty culture this place can breed.

I'm convinced that a lot can be inferred from the type of reactions and the level of hostility one receives when trying to present a calm and nuanced argument on a wedge topic, even if it's not always enjoyable. At the very least it also shows others that they may not be interacting with rational actors, once one's opponents go full mask-off.

I was hoping to comment on this post multiple times today after I initially lost track of it, and now I see you've covered about 75% of what I wanted to say. I'll post the rest elsewhere out of politeness. Thank you.

To be clear, I am happy to see a pedo contained and isolated from society.

At the same time, this direction of law is something that I don't feel I have the sophistication to truly weigh in on, even though it invokes so many thoughts for me.

I hope we as a society get this one right.

It's not really children in these pics. We can't condemn people for things that are not illegal yet.

It's Florida. They will simply book him and then present him with a deal for "only X years in prison," which he'll take, thereby preventing this from going to court and actually being ruled upon.

Yeah people thinking this will make it to SCOTUS are dreaming, unless he happens to be rich.

I've always wondered the same about when an adult cop pretends to be a kid to catch pedos. Couldn't a lawyer argue that because there wasn't actually a child, there wasn't a crime?

It's not really children in these pics.

You are certain about this? If so, where are you getting that info, because it's not in the article?

Generative image models are frequently used for their "infill" (inpainting) capability, which is how nudifying apps work.
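
For anyone unfamiliar with what "infill" means mechanically, here's a rough sketch assuming diffusers' inpainting pipeline (the file names and prompt are placeholders): the model regenerates only the masked region, and every unmasked pixel of the original photo passes through untouched, which is exactly why the person in the output is still the real person.

```python
# Rough sketch of "infill" (inpainting), assuming the `diffusers`
# inpainting pipeline. Only the white region of the mask image is
# regenerated; the rest of the original photo is kept as-is.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",  # example checkpoint
    torch_dtype=torch.float16,
).to("cuda")

photo = Image.open("photo.png").convert("RGB")  # placeholder input
mask = Image.open("mask.png").convert("RGB")    # white = area to regenerate

result = pipe(prompt="a red jacket", image=photo, mask_image=mask).images[0]
result.save("edited.png")
```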

If he was nudifying pictures of real kids, the nudity may be simulated, but the children are absolutely real, and should be considered victims of child pornography.

He wasn’t arrested for creating it, but for distribution.

If dude just made it and kept it privately, he’d be fine.

I’m not defending child porn with this comment.

I’m not defending child porn with this comment.

This is one of those cases where, even if you're technically correct, you probably shouldn't say out loud how you personally would get away with manufacturing child porn, because it puts people in the position of imagining you manufacturing child porn.

Show me multiple (let's say 3+) small-scale independent academic studies, or 1-2 comprehensive and large academic studies, that support one side or the other, and I may be swayed. Otherwise, I think all that's being accomplished is that one guy's life is getting completely ruined, for now and potentially forever, over some fabrications. As a result he may or may not get help, but I doubt he'll be better off.

My understanding was that CSAM has its legal status specifically because there are victims who are hurt by these crimes, and possession supports a broader market that facilitates said harm to those victims. It's not as easy to make a morality argument (especially a good one) for laws that affect everybody when there are no known victims.

Dude is gonna get fucked, but someone had to be the test case. Hopefully this gets some legal clarity.

Are you stupid? Something has to be in the training data for any generation to be possible. This is just a new way to revictimize kids.

So are you suggesting they can get an unaltered facial ID of the kids in the images? Because that would make it regular CSAM with a specific victim (as mentioned), not an AI-generated illustration.

No, I am telling you CSAM images can't be generated by an algorithm that hasn't trained on CSAM.

That's patently false.

I'm not going to continue entertaining this discussion; instead, I'll direct you to the multiple other people who have already effectively disproven this argument, and similar arguments, elsewhere in this post's discussion. Enjoy.

Also, if you'd like to see why the corn dog comment is absurd and wrong, go look up my comment.

I must admit, the number of comments defending AI images as not being child porn is truly shocking.

In my book, sexual images of children are not okay, AI-generated or otherwise. Pedophiles need help, counseling, and therapy, not images that enable something I think is not acceptable in society.

I truly do believe that AI images should be subject to the same standards as regular images in what content we deem appropriate or not.

Yes, this can be used to wrongfully prosecute innocent people, but it does not mean that we should freely allow AI-CP.

I generally think if something is not causing harm to others, it shouldn't be illegal. I don't know if "generated" CSAM causes harm to others though. I looked it up and it appears the research on whether CSAM consumption increases the likelihood of a person committing child abuse is inconclusive.

Agreed, especially considering it will eventually become indistinguishable.

You’re not kidding.

The only possible way I could see a defense is if it were something like "AI CSAM results in a proven reduction of actual CSAM."

But. The defenses aren’t even that!

They’re literally saying that CSAM is okay. I’m guessing a lot of these same comments would argue that deepfakes are okay as well. Just a completely fucked up perspective.

Can't speak for others, but I agree that AI-CP should be illegal.

The question is how we define the crime under our current laws. It does seem like we need a new law to address AI images, both for things like AI-CP, revenge porn, and slanderous/misleading photos (the Communist Harris and Trump-with-Black-people photos).

Where do we draw the line?
How do we regulate it?
Forced watermarks/labels on all tools?
Jail time? Fines?
Forced correction notices? (Doesn't work for the news!)

This is all a slippery slope, but what I can say is that I hope this goes to court. He loses. Appeals. Then it goes all the way up to the federal level, so we have a standard to point to.

The shit's wrong.
Step one in fixing shit.

Iirc he was prosecuted under a very broad "obscenity" law, which should terrify everyone.

Pedophiles need help, counseling and therapy. Not images that enable something I think is not acceptable in society.

I mean, 30-40 years ago you could replace the word "pedophile" with "homosexual" and a vast majority of people would have agreed. I'm not defending pedophilia here, but it's important to remember these people are born the way they are. Nothing is going to change that; new pedophiles are born every day. They will never go away, the same way you can't change gay or transgender people. Repressing sexual desire never works; look at the priests in the Catholic Church. A healthy outlet such as AI-generated porn could save a lot of actual children from harm. I think that should be looked into.

Look, I get what you are saying, and I do agree. However, I don't think comparing pedophilic relations to LGBTQ struggles is fair. One is a consensual relationship between consenting adults; the other is exploitation and a high probability of setting someone up for lifelong mental struggles from a young age.

I would like to know what source you have for claiming that pedophiles are "born the way they are."

We understand some of the genetic and intrauterine developmental reasons for homosexuality, being trans, etc. That has scientific backing, and our understanding continues to grow and expand.

Lumping child predators in with consenting adults smacks of the evangelical slippery slope argument against all forms of what they consider to be "sexual deviance." I'm not buying it.

The number of people willing to go to bat for this on Lemmy is truly disturbing. What do they think these AI models are trained on?

It's not necessarily trained on CP; it could be trained on images of children (already fucked up; who gave them that permission?) and on pornography.

The article pointed out that stable diffusion was trained using a dataset containing CSAM

Arrested, but for what crime? I absolutely agree that AI-generated CP should be banned; however, it currently isn't.

Even unrealistic depictions of children in a sexual context are considered CSAM.

In the United States, the U.S. Supreme Court has defined child pornography as material that "visually depicts sexual conduct by children below a specified age"

See New York v. Ferber for more details.

Iirc it was under Florida "obscenity" laws, which cover literally anything the state government finds objectionable.

I looked into this a while back during a similar discussion.

Creating the content itself is not necessarily illegal if it's fictitious.

Transmitting it over a public medium is very illegal, though.

So if you create your own fictitious child porn locally and it never gets transmitted over the internet, you'd maybe be okay, legally speaking, from a federal perspective anyway.

This guy transmitted it and broke the law.

If no children were involved in the production of porn, how is it pedophilic? That's like claiming a picture of water has the same properties as water.

However, a picture of water makes me thirsty. But then again, there is no substitute for water.

I am not defending pedos, or defending Florida for doing something like that.

That might be a you thing. Pictures of water don't make me thirsty. I get the metaphor you're attempting to make, though.

It's pedophilic because it's sexual images of children; fake or not doesn't change that. Drawn images of child pornography are still pedophilic.

The more important question is: is it CSAM? Whether drawn images that represent no real child qualify depends on the legal jurisdiction. Should drawn and AI-generated child porn be legal or banned? I think actually answering that would require significant research into whether its existence acts as an outlet that prevents pedophiles from harming actual children, or whether it encourages their proclivities and makes them more likely to hurt actual children.

Preventing harm to children should be the goal, but actual research into the effects of simulated child porn vis-a-vis pedophiles harming actual children is as unlikely to happen as any other research into pedophilia.

Good. I do not think society owes pedos a legal means to create CSAM.

Images of crimes should be illegal. No one should be able to draw a murder.

It's already illegal to do basically most of the things leading up to a murder. You're not allowed to conspire to commit one, stalk your target, break into a building, torture animals, kidnap, hire a hitman, etc.

TV and movies should not be able to show crimes, because images depicting crimes should be illegal.

(I’m just illustrating the slippery slope criminalizing “artistic” renderings)

I'm not advocating for what you're saying here at all.

So there you go, your slope now has gravel on it.

EDIT: This dude was arrested using today's laws, and I'm pretty sure the series Dexter is still legal to write, direct, film, and view. So your slippery slope is a fallacious one (as most of them tend to be in my experience).

So artistic images of things that would be a crime in real life should be legal?

Sure, with some exceptions and reasonable definitions.

So, murder images should probably be banned right? One of those exceptions.

Why should this be illegal?

Because it's illegal.

It should be illegal for a number of reasons. One is a simple practical one: as the technology advances towards increasing levels of realism it'll become impossible for law enforcement to determine what material is "regular" CSAM versus what material is "generated" CSAM.

So, unless you're looking to repeal all laws against possession of CSAM, you'll have a difficult time crafting a cut-out for generated CSAM.

And honestly, why bother? What's the upside here? To have pedos get a more fulfilling wank to appease them and hope they won't come after your kids for real? I really doubt the premise behind that one.

And honestly, why bother? What’s the upside here?

Allowing for victimless crimes simply because a group is undesirable is a terrible precedent. We can't ban things just because they make us uncomfortable, or because it makes law enforcement's job easier.

Allowing for victimless crimes simply because a group is undesirable is a terrible precedent.

I wouldn't even call it victimless, and we have all kinds of actual victimless crimes that are already illegal so I don't care about supposedly setting this "precedent" that has already been set a million times over.

and we have all kinds of actual victimless crimes that are already illegal

And we should be undoing those laws

We aren't though, so it's frankly pretty odd that you're fixated on this one.

It's frankly pretty odd that Lemmy in general seems to be fairly pro-generated-CSAM. I'm betting you guys are just afraid of the feds finding your stashes.

EDIT: I basically get maybe three replies a week to things I post on here, except when I post something about being okay with generated CSAM or deepfake porn being illegal (in which case I get binders full of creeps in my inbox).

There it is. "If you disagree with me you're a pedo". You people always go back to that. I bet I could convince you to hack off your own arm as long as I said anyone with a left arm is a pedo.

But you are allowed to stage a murder; hire a film crew to record your staged murder; pay television stations and websites to show a preview of your staged murder, and sell a recording of your staged murder to anyone who wants to buy. Depending on how graphic it is, you might be required to put an age advisory on your work and not broadcast it on public airwaves; but that is about the extent of the regulation.

You can even stage murders of children and show that.

Even if the underlying murder is real, there is still no law outlawing having a recording of it. Even producing a recording of a murder isn't technically illegal, although depending on the context you might still be implicated in a conspiracy to commit murder.

Sexual assault of children is the only crime for which the mere depiction of the crime is itself a crime.

Sexual assault of children is the only crime for which the mere depiction of the crime is itself a crime.

Okay. Why should I care that this is an exception case?

Laws aren't derived from clean general principles by the gods on Mt. Olympus.

Another thing the "can't we think of the poor, helpless pedos who want to nut?" crowd doesn't seem to consider: if AI CSAM grows increasingly realistic and we carve out an exception for it, how can you enforce laws against non-generated CSAM? You'd have to have some computer-forensics asshole involved in every case to prove whether or not the images are generated, which would likely produce a chilling effect over time on regular CSAM cases, and all of this for the "societal benefit" of allowing pedos a more gratifying wank.

For the societal benefit of not ceding our rights to government.

Today it's pedophilia, but what about if Trump wins and legally designates trans people as pedophiles?

This is a power we cannot allow the government to have.

For the societal benefit of not ceding our rights to government.

The right to create realistic looking CSAM is not a right I give a shit about having.

Today it’s pedophilia, but what about if Trump wins and legally designates trans people as pedophiles?

What if a lawless dictator does crazy things? I'm not sure the law (or lack thereof) will have anything to do with that scenario.

The whole idea of "if X is illegal, then the government will use the law against X against us" only matters in the context of a government following the law.

If they're willing to color outside the lines and the courts say the law doesn't apply to them then it doesn't matter one fucking bit.

I'm not suggesting they'll abandon the rule of law. I'm suggesting they'll use this as precedent to legally oppress us.

And I'm saying that they won't have to.

Which is a terrible excuse for allowing bad laws. Sure, ok, let's assume you're right and this specific group will break the law anyway. Maybe let's prepare for other tyrannical groups?

Other tyrannical groups that also won't follow the law? I don't think laws will "prepare" for those either.

Pretty sure the training data sets are CSAM.

Edit, to those downvoting me and not reading the article:

A 2023 study from Stanford University also revealed that hundreds of child sex abuse images were found in widely-used generative AI image data sets.

"The content that we’ve seen, we believe is actually being generated using open source software, which has been downloaded and run locally on people’s computers and then modified," Internet Watch Foundation chief technology officer Dan Sexton told The Guardian last year. "And that is a much harder problem to fix."

I would imagine that an AI trained on both pictures of kids and adult sexual content would be able to mix the two, even if the output might end up uncanny.

That's the most likely case. Now my question is: was he using somebody else's generator, or did he train this one himself?

They mentioned it looks to be a local model he trained.

One doesn't need to browse AI-generated images for longer than five seconds to realize these models can generate a ton of stuff that you can know with absolute certainty wasn't in the training data. I don't get why people insist on the narrative that they can only output copies of what they've already seen. What's generative about that?

If you took a minute to read the article:

A 2023 study from Stanford University also revealed that hundreds of child sex abuse images were found in widely-used generative AI image data sets.

"The content that we’ve seen, we believe is actually being generated using open source software, which has been downloaded and run locally on people’s computers and then modified," Internet Watch Foundation chief technology officer Dan Sexton told The Guardian last year. "And that is a much harder problem to fix."

So not only do the online models have CSAM in their training data, but people are downloading open-source software, and I'd be very surprised if they weren't feeding it CSAM.

That doesn't dispute my argument; generative AI can create images that are not in the training data. It doesn't need to know what something looks like as long as the person using it does and can write the right prompt for it. The corn dog I posted below is a good example: you can be sure that wasn't in the training data, yet it was still able to generate it.
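If you want to see the composition effect for yourself, here's a minimal sketch using Hugging Face's diffusers library (real) with a public Stable Diffusion checkpoint; treat the exact model ID as a placeholder, since checkpoints get moved around. Prompt for a combination of concepts the training data almost certainly never showed together, and the model renders it anyway:

```python
# Minimal sketch: concept composition in a text-to-image model.
# Assumes a CUDA GPU; swap .to("cuda") for .to("cpu") (slowly) if needed.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder public checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# "Corn dog in deep space" was surely never a training photo, but the
# model knows corn dogs and knows space, and composes the two.
image = pipe("a corn dog floating in deep space, studio photograph").images[0]
image.save("corn_dog_in_space.png")
```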

Online models since that discovery have scrubbed the offending sources and retrained, as well as added safeguards to their models to try and prevent it.

If that's the basis for making it illegal, then all AI is illegal.

Which...eh maybe that's not such a bad idea

Since that study, every legit AI model has removed said images from its dataset, and models trained afterwards no longer carry knowledge of those source images.
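The scrubbing step itself is conceptually simple. A sketch below, where `flagged_by_safety_check` is hypothetical shorthand for the real machinery (perceptual-hash matching against known-abuse hash lists, plus classifier-based filtering):

```python
# Conceptual sketch of pre-training dataset scrubbing.
# `flagged_by_safety_check` is a hypothetical stand-in for the real
# machinery: hash matches against known-abuse lists plus classifiers.
from typing import Any, Callable, Iterable, List

def scrub_dataset(
    samples: Iterable[Any],
    flagged_by_safety_check: Callable[[Any], bool],
) -> List[Any]:
    clean = []
    for sample in samples:
        if flagged_by_safety_check(sample):
            continue  # dropped: the model never trains on flagged samples
        clean.append(sample)
    return clean
```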

I know one AI model specifically excluded photos of underage people altogether, to minimize the possibility of this happening even by accident. Still, making CSAM from an AI model is something anyone determined and patient enough can do with a good model trainer and a dataset of source images that have the features they want, even if the underage images are completely clean.

Making CSAM with an AI model is a deliberate act in almost every case... and in this case, he was arrested for distributing these images, which is super illegal for obvious reasons.

Honestly, I don't care if it is AI/not real, I'm glad that the man was arrested. He needs some serious help for sexualising kids.

You and I both know he's not going to get it. I have a kind of sympathy for people attracted to kids who refuse to act on it. They clearly know it's not normal and recognize the absolutely life-destroying damage they can cause if they act on it. That being said, there aren't many places you can go to seek treatment. Any institution that advertised treatment would have people outside with pitchforks and torches.

Before anyone tries to claim I'm pro-pedo: you can fuck right off. I just wish it were possible for people who are attracted to kids and not out touching them to get some kind of therapy and medication to make them normal (or destroy their sex drive) before something terrible happens.

to get some kind of therapy and medication to make them normal

Hi, Psychologist here. Does society have strong evidence that therapeutic interventions are reducing rates of, say, the most common disorders of anxiety and depression? Considering that the rates of these are going up, I don't think we can assume there's a hugely successful therapy to help those attracted to CSA images to change. Psychology is not a very good science principally because it offers few extremely effective answers in the real world.

In terms of medication, androgen antagonists are generally used, because lowering testosterone generally leads to a lower sex drive. Here is an article about those drugs, including an offender who asked for them: https://www.theguardian.com/society/2016/mar/01/what-should-we-do-about-paedophiles

TW: the article contains discussion of whether offenders are even psychologically disordered, when set within a historical cultural context of child-marriage. This paragraph is two above the illustration of people trapped within concentric circular walls, and starts "In the 2013 edition ...".

Collis began to research the treatment and decided that it was essential to his rehabilitation. He believes he was born a paedophile, and that his attraction to children is unchangeable. “I did NOT wake up one morning and decide my sexual preference. I am sexually attracted to little girls and have absolutely no interest in sex with adults. I’ve only ever done stuff with adults in order to fit in with what’s ‘normal’.” For Collis, therefore, it became a question of how to control this desire and render himself incapable of reoffending.

[...]

Many experts support Aaron Collis’s self-assessment, that paedophilia is an unchangeable sexual preference. In a 2012 paper, Seto examined three criteria – age of onset, sexual and romantic behaviour, and stability over time. In a number of studies, a significant proportion of paedophiles admitted to first experiencing attraction towards children before they had reached adulthood themselves. Many described their feelings for children as being driven by emotional need as well as sexual desire. As for stability over time, most clinicians agreed that paedophilia had “a lifelong course”: a true paedophile will always be attracted to children. “I am certainly of the view,” Seto told me, “that paedophilia can be thought of as a sexual orientation.”

Brain-imaging studies have supported this idea. James Cantor, a psychiatry professor at the University of Toronto, has examined hundreds of MRI scans of the brains of paedophiles, and found that they are statistically more likely to be left-handed, shorter than average, and have a significantly lower density of white matter, the brain’s connective tissue. “The point that’s important for society is that paedophilia is in the brain at all, and that the person didn’t choose it,” Cantor told me. “As far as we can tell, they were born with it.” (Not that this, he emphasised, should excuse their crimes.)

[...]

Clinical reality is a little more complicated. “There’s no pretence that the treatment is somehow going to cure them of paedophilia,” Grubin told me. “I think there is an acceptance now that you are not going to be able to change very easily the direction of someone’s arousal.” Grubin estimates that medication is only suitable for about 5% of sex offenders – those who are sexually preoccupied to the extent that they cannot think about anything else, and are not able to control their sexual urges. As Sarah Skett from the NHS put it: “The meds only take you so far. The evidence is clear that the best treatment for sex offending is psychologically based. What the medication does is help people have a little bit of control, which then allows them to access that treatment.”

 

Some research on success rates:

Prematurely terminating treatment was a strong indicator of committing a new sexual offense. Of interest was the general improvement of success rates over each successive 5-year period for many types of offenders. Unfortunately, failure rates remained comparatively high for rapists (20%) and homosexual pedophiles (16%), regardless of when they were treated over the 25-year period. [https://pubmed.ncbi.nlm.nih.gov/11961909/]

Within the observation period, the general recidivism and sexual recidivism rates were 33.1% and 16.5%, respectively, and the sexual contact recidivism rate was 4.7%. [https://journals.sagepub.com/doi/abs/10.1177/0306624X231165416 - this paper says that suppressing the sex drive with medication was the most successful treatment]

Men with deviant sexual behavior, or paraphilia, are usually treated with psychotherapy, antidepressant drugs, progestins, and antiandrogens, but these treatments are often ineffective. Selective inhibition of pituitary–gonadal function with a long-acting agonist analogue of gonadotropin-releasing hormone may abolish the deviant sexual behavior by reducing testosterone secretion. [https://www.nejm.org/doi/full/10.1056/nejm199802123380702 - this paper supports that lowering testosterone works best]
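To put the quoted figures side by side (percentages copied straight from the papers cited above; the script is just a tabulation, nothing more):

```python
# Tabulating the percentages quoted from the cited papers above.
rates = {
    "general recidivism (sagepub 0306624X231165416)": 33.1,
    "failure rate, rapists (pubmed 11961909)": 20.0,
    "sexual recidivism (sagepub 0306624X231165416)": 16.5,
    "failure rate, homosexual pedophiles (pubmed 11961909)": 16.0,
    "sexual contact recidivism (sagepub 0306624X231165416)": 4.7,
}
for label, pct in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{pct:5.1f}%  {label}")
```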

Thank you for such a well laid out response and the research to back it up. I rarely see people approaching the subjects of pedophilia, and how best to treat pedophiles, rationally and analytically.

It's understandable, considering the harm they can cause to society, that most people can only ever view them as monsters; and indeed, those who are incapable of comprehending the harm they cause, or of empathizing with those they have harmed or could harm, are IMHO some of the more loathsome individuals.

That said, I think people are too often willing to paint those whose proclivities are so alien and antithetical to our own as not just monsters, but monsters not worth understanding with any degree of nuance, and that we ultimately do ourselves and future generations a disservice by not at least attempting to address the issue at hand, in the hope that the most harmful parts of our collective psyche can be treated, palliatively, to the best of our ability.

Your annotated sources indicate that there is not nearly as clear a path forward as the "pedophiles are simply monsters and there's no reason to look into their motives further" camp would like to believe, while the very existence of the attempted treatments points to more work to be done in the hope of finding a more lasting and successful treatment.

Like many of the psychological ailments plaguing societies today, this is not a problem you can simply kill and imprison away; that is always a short-term (albeit at times temporarily effective) solution. Greatly reducing the occurrence of pedophilia will ultimately require more of this kind of research, and more analysis and study toward achieving such ends.

Again, I thank you for your nuanced post, and commend you for taking your nuanced stance as well.

Isn't it more about power than sex many times anyway?

For people actually abusing? Spot on, most of the time.

For non-offending paedos? Nah… a horrible affliction.

Not always. There are people with brain injuries who suddenly develop an attraction towards kids and it's not really due to power dynamics or anything else.

I don't understand why we haven't used inhalable oxytocin as an experimental drug for people attracted to children and animals. It seems intuitive: children and animals generate oxytocin in humans automatically, and it's possible some people need a stronger stimulus to release oxytocin, or don't have much endogenous oxytocin to begin with. Oxytocin can be compounded at a pharmacy and has been used successfully for social anxiety.

And that guy gets that help in a prison, riiight.

Depending on the state, yes actually.

I did time in a medium security facility that also did sex offender treatment (I was there on drug charges). I still have friends that went through that program.

The men who were actually invested in getting better, got better. The ones invested in staying well, are still well.

Chemical castration offers the best success rates; see my comment under the OP citing research.

I don't think that would actually help him.

You don't think that reducing testosterone and therefore sex drive will change offending rates? That is contrary to research which has reliably found that this is the best therapy, in terms of effectiveness on recidivism.

That guy didn't even commit anything beyond having AI imagery depicting children.

That guy has a mental problem that you can't treat with chemical castration alone. He needs more than that.

That does not change the fact that chemical castration is the most successful treatment we have to stop CSA recidivism at present.

 

That guy didn't even commit anything beyond having AI imagery depicting children.

Possessing and distributing images that sexually objectify children may be a crime, even if generated by AI.

Cutting off their testicles and straight up executing them would also reduce the offending rates. Even more effectively than chemical castration, I'm sure. But we wouldn't be calling that helping the offender, would we? And the comment above was specifically talking about helping them.
What we have now is more of a middle ground between the damage done to the patient and safety guarantees for society. We obviously prioritize society's safety, but we should be striving for less damage to the patient, too.

...we should be striving for less damage to the patient, too.

Can you make someone just not sexually interested in something they find arousing? As far as I know, conversion therapy for non-heterosexual people doesn't have good success rates. Also, those therapies also tended to involve some form of harm, from what I've heard.

It's not about making someone want something less, but about helping them never act on those urges.

Computer-generated imagery could in theory be helpful, so the itch gets scratched without creating victims or criminals.

I'd call that a win-win in terms of societal well-being, as fewer funds are wasted on police work, jailing a perpetrator, and therapy for a victim.

Computer-generated imagery could in theory be helpful, so the itch gets scratched

The Freudian concept of catharsis has been debunked many, many times. Does rape go down when freely available internet porn goes up? That's a rhetorical question. Goodbye.

It hasn't been debunked? I've read that rape did go down when porn became readily available. If that's not the case, then it's a whole other story.

Can you make someone just not sexually interested in something they find arousing?

No, I can't. That doesn't mean we (as a society) shouldn't be working on finding ways to do it, or on alternative solutions. And it's necessary to acknowledge that what we have now is not good enough.

those therapies also tended to involve some form of harm

They probably did. But nobody here is claiming those were good or helping the patients either.
