“CSAM generated by AI is still CSAM,” DOJ says after rare arrest

jeffw@lemmy.world (mod) to News@lemmy.world – 291 points
arstechnica.com


I think it would just create naked adults with children's faces unless it actually had CSAM in its training data... which it probably does have.

Again, that's not how it works.

Could you hypothetically describe CSAM without describing an adult with a child's head, or specifying that it's a naked child?
That's what a person trying to generate CSAM would need to do, because the model doesn't have those concepts.
If you just asked it directly, like the "horse flying a hang glider" example I gave before, you would get what you describe, because it's drawing on the only "naked" it knows.
You would need to specifically ask it to de-emphasize adult characteristics and emphasize child characteristics.

That doesn't mean it was trained on that content.

For context from the article:

The DOJ alleged that evidence from his laptop showed that Anderegg "used extremely specific and explicit prompts to create these images," including "specific 'negative' prompts—that is, prompts that direct the GenAI model on what not to include in generated content—to avoid creating images that depict adults."
