"A Billion Nazis at the Table" - The Fediverse model proves contextual moderation by real humans is both easy and affordable. The presence of Nazis on corporate social media implies at least a tacit a
staygrounded.online
"If you’ve ever hosted a potluck and none of the guests were spouting antisemitic and/or authoritarian talking points, congratulations! You’ve achieved what some of the most valuable companies in the world claim is impossible."
We need to keep distinguishing "actual, real-life child-abuse material" from "weird/icky porn". Fediverse services have been used to distribute both, but they represent very different classes of problem.
Real-life CSAM is illegal to possess. If someone posts it on an instance you own, you have a legal problem. It is an actual real-life threat to your freedom and the freedom of your other users.
Weird/icky porn is not typically illegal, but it's something many people don't want to support or be associated with. Instance owners have a right to say "I don't want my instance used to host weird/icky porn." Other instance owners can say "I quite like the porn that you find weird/icky, please post it over here!"
Real-life CSAM is not just extremely weird/icky porn. It is a whole different level of problem, because it is a live threat to anyone who gets it on their computer.