Lemmyshitpost community closed until further notice

lwadmin@lemmy.world (mod) to Lemmy.World Announcements@lemmy.world – 1738 points

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do: since we changed our registration policy, the attackers simply post from other instances.

We keep working on a solution; we have a few things in the works, but they won't help us right now.

Thank you for your understanding, and apologies to our users, our moderators, and the admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators, and if it hadn't been his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team, and we can hopefully make an announcement about what's next very soon.

Edit 2: Removed the bit about the moderator tools. It came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we've felt helpless. Anyway, I hope we can announce something more positive soon.


"Troll" is too mild a word for these people.

How about "pedophile"? I mean, they had to have the images to post them.

"Terrorist". Having the images doesn't mean they liked them, they used them to terrorize a whole community though.

"Petophilile enabled Terrorist" or "petophilic terrorist" depending on the person

It still means they can tolerate CSAM, or are normalized to it enough that they feel something other than disgust during the "shipping and handling".

All of your comments have "banned" next to them for me (logged in via lemmy.world using Liftoff) - any idea why?

I assume you're not actually banned...?

They were banned because they were defending pedophilia (advocating for pedophiles to be able to get off to what turns them on) and also trolling very aggressively. You can look them up in the modlog on the website; I'm not sure if apps implement the modlog yet, though.

Ah, thanks. I've seen it a few times but thought it was a bug. Why are people like this?!

I have no clue; people can be quite toxic and horrible. I also noticed that they reduced his ban, not sure why. Defending pedophilia is pretty bad and definitely carries legal risk, but it's not my call to make.

Yeah, I got banned for forgetting that some axioms give people a free pass to say whatever they want, no matter how they say it... and that replying in kind is forbidden. My bad.

You were banned because you were arguing that people shouldn't be arrested for possession of CSAM, trolling and straw-manning in the replies, and on top of that seriously and honestly advocating for pedophiles in another community, which is at best borderline illegal (anyone can check the modlog on that one if they don't believe me; I wouldn't make such claims if they weren't true).

So to summarize, you were banned for:

  • Trolling
  • Promoting Illegal Activities (Pedophilia and CSAM)

Yeah, this isn't just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don't care whether you sought out the CSAM, because it still exists on your device regardless of intent.

The laws about possessing CSAM are written in a way that removes any plausible deniability, specifically to prevent pedophiles from going "oh lol, a buddy sent that to me as a joke" and getting acquitted. The courts don't care why you have CSAM on your server; all they care about is the fact that you do. And since you own the server, you own the CSAM, and they'll prosecute you for it.

And it's not just the instance admins who would be at risk. Any time you view an image, your device makes a local copy of it, meaning every person who viewed the image, even accidentally, is at risk as well.

Yeah, honestly, report all of those accounts to law enforcement. It's unlikely they'd be able to do much, I assume, but these people are literally distributing CSAM.