Stanford researchers find Mastodon has a massive child abuse material problem
theverge.com
Not a good look for Mastodon - what can be done to automate the removal of CSAM?
The actual report is probably better to read.
It points out that you upload to one server, and that server then sends the image to thousands of others. How do those thousands of others scan for this? In theory, using the PhotoDNA tool that large companies use, but then you have to send every image to PhotoDNA thousands of times, once for each server (because how do you trust another server telling you it's fine?).
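Roughly what that duplication looks like, as a minimal sketch and not Mastodon's actual code: `scan_with_photodna` below is a hypothetical stand-in for a call to a hash-matching service like PhotoDNA (the real service is proprietary and access-restricted), and `deliver` is a placeholder for federation delivery.

```python
# Sketch of the problem: with no way to trust the origin server's scan,
# every receiving server repeats the same scan on the same image.
import hashlib

def scan_with_photodna(image_bytes: bytes) -> bool:
    """Hypothetical stand-in for querying a perceptual-hash scanning service."""
    known_bad_hashes = set()  # placeholder for the service's hash database
    return hashlib.sha256(image_bytes).hexdigest() in known_bad_hashes

def deliver(server: str, image_bytes: bytes) -> None:
    ...  # placeholder: send the image to the remote server

def receive_federated_image(server: str, image_bytes: bytes) -> None:
    # Each of the potentially thousands of receiving servers runs this
    # independently, so the same image gets scanned thousands of times.
    if scan_with_photodna(image_bytes):
        return  # drop the image on a match
    deliver(server, image_bytes)
```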
The report provides recommendations on how servers can use signatures and public keys to trust scan results from PhotoDNA, so images can be federated with a level of trust. It also suggests large players entering the market (Meta, Wordpress, etc) should collaborate to build these tools that all servers can use.
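A rough idea of how a signed scan result could work, assuming an Ed25519 keypair whose public half the origin server publishes; the attestation fields and format here are purely illustrative, not from any real spec or from the report itself:

```python
# Sketch: origin server scans once and signs the verdict; receiving servers
# verify the signature instead of re-running the scan.
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def make_attestation(private_key: Ed25519PrivateKey, image_bytes: bytes) -> dict:
    """Origin server: scan the image once, then sign the verdict for its hash."""
    payload = json.dumps({
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "scanner": "photodna",   # hypothetical label for whichever backend scanned it
        "verdict": "clean",
    }).encode()
    return {"payload": payload, "signature": private_key.sign(payload)}

def verify_attestation(public_key: Ed25519PublicKey,
                       attestation: dict,
                       image_bytes: bytes) -> bool:
    """Receiving server: trust the scan if the signature is valid and the
    signed hash matches the image actually received."""
    try:
        public_key.verify(attestation["signature"], attestation["payload"])
    except InvalidSignature:
        return False
    claimed = json.loads(attestation["payload"])
    return (claimed["verdict"] == "clean"
            and claimed["sha256"] == hashlib.sha256(image_bytes).hexdigest())
```

The point of binding the verdict to the image hash is that a receiving server only has to verify a signature, which is cheap, rather than trusting the origin server's word or paying for its own scan.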
Basically the original report points out how easy it is to find CSAM on Mastodon, addresses the challenges unique to federation, and proposes solutions. It doesn't claim centralised servers have it solved; it just addresses the additional challenges federation has.
Yeah, I was mostly just complaining about the poor quality of mainstream tech articles. The original report is a much better read and brings up some great points.