Stanford researchers find Mastodon has a massive child abuse material problem

corb3t@lemmy.world to Technology@lemmy.ml – 193 points –
theverge.com

Not a good look for Mastodon - what can be done to automate the removal of CSAM?

Seems odd that the article mentions Mastodon as a Twitter alternative but makes no mention of the fact that Twitter is rife with the same problems, increasingly so as it sheds employees and therefore moderation capacity. These problems have existed on Twitter for far longer, and not nearly enough has been done about them.

The actual report is probably better to read.

It points out that you upload to one server, and that server then sends the image to thousands of others. How do those thousands of others scan for it? In theory, using the PhotoDNA tool that large companies use, but then every image gets sent to PhotoDNA thousands of times, once per receiving server (because how do you trust another server telling you it's fine?).
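
To make the duplication concrete, here's a rough sketch (mine, not from the report) of the naive approach, where every receiving server repeats the same check. The local hash-set lookup is only a placeholder for PhotoDNA, which in reality is an external service doing perceptual matching against a private database:

```python
import hashlib

# Placeholder for a PhotoDNA-style database. The real service does fuzzy
# perceptual matching; a plain SHA-256 set is just here to mark where the
# check would happen.
KNOWN_BAD_HASHES: set[str] = set()

def naive_federated_scan(image_bytes: bytes, receiving_servers: list[str]) -> dict[str, bool]:
    """Every server that receives the federated image repeats the same check.

    With thousands of receiving servers, the same image gets scanned thousands
    of times, because no server has a reason to trust another server's verdict.
    """
    results = {}
    for server in receiving_servers:
        # Each server independently re-hashes and re-checks the image.
        digest = hashlib.sha256(image_bytes).hexdigest()
        results[server] = digest in KNOWN_BAD_HASHES
    return results
```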

The report recommends that servers use signatures and public keys to trust scan results from PhotoDNA, so images can be federated with a level of trust. It also suggests that large players entering the market (Meta, WordPress, etc.) should collaborate to build these tools for all servers to use.
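
As a rough illustration of that signed-attestation idea (again my own sketch, not code from the report), the origin server scans the image once, signs a small result record, and receiving servers verify the signature against the origin's published public key instead of re-scanning. Ed25519 via the Python `cryptography` package, the function names, and the field names are all assumptions for the example:

```python
import json
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def make_attestation(image_bytes: bytes, verdict: str, signing_key: Ed25519PrivateKey) -> dict:
    """Origin server scans once, then signs a small attestation of the result."""
    payload = json.dumps(
        {
            "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
            "verdict": verdict,      # e.g. "clean" after the origin's PhotoDNA check
            "scanner": "photodna",
        },
        sort_keys=True,
    ).encode()
    return {"payload": payload, "signature": signing_key.sign(payload)}

def verify_attestation(image_bytes: bytes, attestation: dict, origin_pubkey: Ed25519PublicKey) -> bool:
    """Receiving server accepts the federated result only if the signature is valid
    and the attested hash matches the image it actually received."""
    try:
        origin_pubkey.verify(attestation["signature"], attestation["payload"])
    except InvalidSignature:
        return False
    claimed = json.loads(attestation["payload"])
    return claimed["image_sha256"] == hashlib.sha256(image_bytes).hexdigest()
```

The missing piece is key distribution: receiving servers would have to fetch and pin the origin's public key from somewhere they already trust, presumably alongside the instance's existing metadata, which is the kind of shared tooling the report suggests the big players build together.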

Basically, the original report points out how easy it is to find CSAM on Mastodon and addresses the challenges unique to federation, including proposed solutions. It doesn't claim centralised servers have it solved; it just addresses the additional challenges federation has.

Yeah, I was mostly just complaining about the poor quality of mainstream tech articles. The original report is a much better read and brings up some great points.
