FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

misk@sopuli.xyz to Technology@lemmy.world – 505 points
404media.co

It's worth mentioning that in this instance the guy did send porn to a minor. This isn't exactly a cut-and-dried "guy used Stable Diffusion wrong" case. He was distributing it and grooming a kid.

The major concern to me is that there isn't really any guidance from the FBI on what you can and can't do, which may lead to some big issues.

For example, websites like NovelAI make a business out of providing pornographic, anime-style image generation. The models they use are deliberately tuned to produce abstract, "artistic" styles, but they can generate semi-realistic images.

Now, let's say a criminal group uses NovelAI to produce CSAM of real people via the inpainting tools. Let's say the FBI casts a wide net and begins surveillance of NovelAI's userbase.

Is every person who goes on there and types "Loli" or "Anya from Spy x Family, realistic, NSFW" (that's an underage character) going to get a letter in the mail from the FBI? I feel like it's within the realm of possibility. What about "teen girls gone wild, NSFW"? Or "young man, no facial or body hair, naked, NSFW"?

This is NOT a good scenario, imo. The systems used to produce harmful images are the same systems used to produce benign or borderline images. It's a dangerous mix, and it throws the whole enterprise into question.

The major concern to me is that there isn't really any guidance from the FBI on what you can and can't do, which may lead to some big issues.

https://www.ic3.gov/Media/Y2024/PSA240329
https://www.justice.gov/criminal/criminal-ceos/citizens-guide-us-federal-law-child-pornography

They've actually issued warnings and guidance, and the law itself is pretty concise regarding what's allowed.

(8) "child pornography" means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where-

(A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct;

(B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or

(C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

...

(11) the term "indistinguishable" used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.

https://uscode.house.gov/view.xhtml?hl=false&edition=prelim&req=granuleid%3AUSC-prelim-title18-section2256&f=treesort&num=0

Honestly, if you're going to be doing grey-area things, you should do more than the five minutes of searching it took me to find those.

It was basically born out of a Supreme Court case in the early 2000s (Ashcroft v. Free Speech Coalition, 2002) regarding an earlier version of the law that went much further, banning anything that "appeared to be" or "was presented as" sexual content involving minors, regardless of context. That version could plausibly have been used against young-looking adult models, artistically significant paintings, or works like Romeo and Juliet, which is neither explicit nor vulgar but could be presented as involving child sexual activity. (Juliet is 14, and it's clearly labeled as a love story.)
After the relevant provisions were struck down, a new law was passed that factored in the justices' rationale and commentary about what would be acceptable, giving us our current system: it has to have some redeeming value, or not involve actual children and plausibly not look like it involves actual children.

The major concern to me is that there isn't really any guidance from the FBI on what you can and can't do, which may lead to some big issues.

The PROTECT Act of 2003 means that any artistic depiction of CSAM is illegal. The guidance is pretty clear: the FBI is gonna raid your house... eventually. We still haven't properly funded the anti-CSAM departments.

Is every person who goes on there and types "Loli" or "Anya from Spy x Family, realistic, NSFW" (that's an underage character) going to get a letter in the mail from the FBI?

I'll throw that baby out with the bathwater to be honest.

Simulated crimes aren't crimes. Would you arrest every couple that finds healthy ways to simulate rape fetishes? Would you arrest every person who watches The Fast and the Furious or The Godfather?

If no one is being hurt, if no real CSAM is being fed into the model, and if no pornographic images are being sent to minors, it shouldn't be a crime. Just because it makes you uncomfortable doesn't make it immoral.

Or, ya know, everyone who ever wanted to decapitate those stupid fucking Skyrim children. Crime requires damaged parties, and with this (the idealized case, not the specific one in the article) there are none.

Those were demon children from hell (with like 2 exceptions maybe). It was a crime by Bethesda to make them invulnerable / protected by default.

Simulated crimes aren't crimes.

If they were, anyone who's played games would be fucked. I'm confident everyone who has played went on a total rampage murdering the townsfolk, pillaging their houses, and blowing everything up... in Minecraft.

They would, though. We know they would, because conservatives already did the whole "laws about how you can have sex in private" thing.

Simulated crimes aren’t crimes.

Artistic CSAM is definitely a crime in the United States. PROTECT Act of 2003.

People have only gotten in trouble for that when they're already in trouble for real CSAM. I'm not terribly interested in sticking up for actual CSAM scum.
