Stanford researchers find Mastodon has a massive child abuse material problem
![Stanford researchers find Mastodon has a massive child abuse material problem](https://forkk.me/pictrs/image/5c53a05d-5a60-44fc-a3d5-77bef1f2aa0c.jpeg?format=jpg&thumbnail=256)
theverge.com
Not a good look for Mastodon - what can be done to automate the removal of CSAM?
@Arotrios @corb3t
They want to intimidate you with #ForTheChildren
Sounds like they succeeded
Nah, not intimidated. More that I ran a sizeable forum in the past, and I know what a pain in the ass this kind of content can be to deal with. That's why I was asking about automated tools to handle it. The forum I ran got targeted by a bunch of Turkish hackers, and one of their attack techniques involved a wave of spambot accounts trying to post crap content. I wasn't intimidated (I fought them for about two years straight), but by the end of it I was exhausted to the point where it just wasn't worth it anymore. An automated CSAM filter would have made a huge difference, but this was over a decade ago and those tools weren't around.
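For anyone wondering what those automated tools actually look like: the standard approach is hash matching, where every upload is hashed and checked against a list of known-bad hashes maintained by organizations like NCMEC or the IWF (Microsoft's PhotoDNA works on this principle). Here's a minimal sketch of the idea - note the `load_hash_list` helper and the plain SHA-256 hashing are my own simplifications; real systems use perceptual hashes so resized or re-encoded copies still match:

```python
import hashlib

# Hypothetical helper: in a real deployment the list would be perceptual
# hashes (e.g. PhotoDNA or PDQ) supplied by NCMEC/IWF, not SHA-256 values.
def load_hash_list(path: str) -> set[str]:
    """Load one known-bad hash per line from a local file."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def is_flagged(image_bytes: bytes, bad_hashes: set[str]) -> bool:
    """Return True if the upload matches a known-bad hash.

    SHA-256 only catches byte-identical copies; production filters use
    perceptual hashing to survive resizing and re-encoding.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in bad_hashes

# Sketch of use inside an upload handler:
# bad_hashes = load_hash_list("known_bad_hashes.txt")
# if is_flagged(upload.read(), bad_hashes):
#     reject_and_report(upload)  # hypothetical moderation hook
```

The catch for Mastodon is that the curated hash lists and PhotoDNA-style APIs are generally only licensed to vetted organizations, which is hard for a lone instance admin to get access to.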
@Arotrios @corb3t
Totally reasonable. If (when) I create my own instance, it will be very locked down regarding who I allow to join.
Not sure why you’re continually @-replying to me? Is discussion around ActivityPub content moderation an issue for you?