Ah, Yes! AI Will Surely Save Us All! — Godric@lemmy.world to Lemmy Shitpost@lemmy.world – 801 points – 5 months ago
How can text ever possibly be CSAM when there's no child or sexual abuse involved?
Text, even completely fictional text, can be CSAM depending on the jurisdiction.
I've seen no evidence of that. There are cases tried under obscenity laws, but CSAM has a pretty clear definition of being visual.
Internationally? I know that in Germany there are cases.
I didn't say anything about text?
What exactly do you think erotic roleplay means?
Well, I honestly hadn't considered someone texting with an LLM; I was thinking more about AI-generated images.