Stanford researchers find Mastodon has a massive child abuse material problem

corb3t@lemmy.world to Technology@lemmy.ml – 193 points
theverge.com

Not a good look for Mastodon - what can be done to automate the removal of CSAM?
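
One common way to automate this is hash matching: compute a fingerprint of every uploaded image and compare it against a database of hashes of known abuse material (the idea behind Microsoft's PhotoDNA and the NCMEC hash lists, which aren't publicly available). Below is a minimal sketch of the concept in Python, using the open `imagehash` library as a stand-in for a real matching service; the hash file, threshold, and function names are illustrative, not any actual Mastodon hook.

```python
# Conceptual sketch only: flag uploads whose perceptual hash is close to a
# set of known-bad hashes. Real deployments use vetted databases (PhotoDNA,
# NCMEC/Safer) plus human review and mandatory reporting; the file name and
# distance threshold below are made up for illustration.
from PIL import Image
import imagehash

MATCH_DISTANCE = 4  # max Hamming distance treated as a match (illustrative)

def load_known_hashes(path: str) -> set:
    """Read hex-encoded perceptual hashes, one per line."""
    with open(path) as f:
        return {imagehash.hex_to_hash(line.strip()) for line in f if line.strip()}

def should_block(image_path: str, known_hashes: set) -> bool:
    """True if the uploaded image is within MATCH_DISTANCE of any known hash."""
    h = imagehash.phash(Image.open(image_path))
    return any((h - known) <= MATCH_DISTANCE for known in known_hashes)

known = load_known_hashes("known_bad_hashes.txt")
if should_block("incoming_upload.jpg", known):
    print("match: quarantine the upload and escalate for review/reporting")
```

A match should trigger quarantine plus a report to the relevant authority rather than a silent delete, and hash matching only ever catches material that is already known.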


I don't see what a server admin can do about it other than defederate the instant they get reports. Otherwise how can they possibly know?
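
Once a report does come in, acting on it can at least be scripted instead of clicked through the admin UI. Here is a minimal sketch, assuming Mastodon's admin API endpoint `POST /api/v1/admin/domain_blocks` (available in recent versions) and an access token with the `admin:write:domain_blocks` scope; the instance URL, token, and domain are placeholders.

```python
# Sketch: suspend (defederate from) a remote domain via Mastodon's admin API.
# Assumes a recent Mastodon instance and an admin-scoped access token;
# the URL, token, and domain below are placeholders.
import requests

INSTANCE = "https://example.social"
TOKEN = "REPLACE_WITH_ADMIN_TOKEN"  # needs admin:write:domain_blocks scope

def suspend_domain(domain: str, comment: str) -> None:
    """Create a domain block so this instance stops federating with `domain`."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "domain": domain,
            "severity": "suspend",   # silence | suspend | noop
            "reject_media": "true",  # also stop fetching/caching their media
            "reject_reports": "true",
            "private_comment": comment,
        },
        timeout=30,
    )
    resp.raise_for_status()

suspend_domain("bad.example", "confirmed CSAM reports")
```

Severity "suspend" cuts off federation entirely, while "silence" only hides the domain's content from public timelines.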

This could be a really big issue though. People can make instances for really hateful and disgusting crap, and even if everyone defederates from them, it still gives them a platform, a tiny corner of the internet to talk about truly horrible topics.

Again, if it's illegal content that's publicly available, officials can charge those site admins with the crime of hosting it. Everyone else just has a duty to defederate.

Those corners will exist no matter what service they use, and there is nothing Mastodon can do to stop that. There's a reason there are public lists of instances to defederate from. This content can only be prevented by domain providers and governments.