Stanford researchers find Mastodon has a massive child abuse material problem

corb3t@lemmy.world to Technology@lemmy.ml – 193 points –
theverge.com

Not a good look for Mastodon - what can be done to automate the removal of CSAM?



@Arotrios @corb3t

They want to intimidate you with #ForTheChildren

Sounds like they succeeded

Nah, not intimidated. More that I ran a sizeable forum in the past and I know what a pain in the ass this kind of content can be to deal with. That's why I was asking about automated tools to deal with it. The forum I ran got targeted by a bunch of Turkish hackers, and one of their attack techniques involved a wave of spambot accounts trying to post crap content. I wasn't intimidated (I fought them for about two years straight), but by the end of it I was exhausted to the point where it just wasn't worth it anymore. An automated CSAM filter would have made a huge difference, but this was over a decade ago and those tools weren't around.
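For what it's worth, the automated tools that exist today (e.g. Microsoft's PhotoDNA, Cloudflare's CSAM Scanning Tool, Thorn's Safer) work by comparing uploads against hash lists of known material. A minimal sketch of that lookup pattern, using a plain SHA-256 hash for illustration rather than the perceptual hashes real services actually use (those match near-duplicates, not just exact bytes, and aren't publicly available):

```python
import hashlib

# Illustration only: real CSAM filters use perceptual hashing via vendor
# APIs; this just shows the hash-list lookup pattern a filter hook is
# built around. All names and data here are made up.

def sha256_hex(data: bytes) -> str:
    """Cryptographic hash of the raw upload bytes (exact-match only)."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes, known_bad_hashes: set) -> bool:
    """Reject an upload whose hash appears on a known-bad list."""
    return sha256_hex(upload) in known_bad_hashes

bad_list = {sha256_hex(b"known-bad-image-bytes")}
print(should_block(b"known-bad-image-bytes", bad_list))  # True
print(should_block(b"harmless-cat-photo", bad_list))     # False
```

An instance would call a check like this in its media-upload path before the file is stored or federated; the hard part isn't the lookup, it's getting access to a maintained hash list, which is exactly what small fediverse admins lack.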

@Arotrios @corb3t

Totally reasonable. If (when) I create my own instance, it will be very locked down re who I allow to join

Not sure why you're continually @ replying to me? Is discussion around ActivityPub content moderation an issue for you?