"A Billion Nazis at the Table" – The Fediverse model proves contextual moderation by real humans is both easy and affordable. The presence of Nazis on corporate social media implies at least a tacit acceptance of them.

JustinHanagan@kbin.social to Technology@lemmy.world – 590 points –
A Billion Nazis at the Table
staygrounded.online

"If you’ve ever hosted a potluck and none of the guests were spouting antisemitic and/or authoritarian talking points, congratulations! You’ve achieved what some of the most valuable companies in the world claim is impossible."

If a Fediverse instance grew so big that it couldn't moderate itself and had a lot of spam/Nazis, presumably other instances would just defederate, yeah? Unless an instance is ad-supported, what's the incentive to grow beyond one's ability to stay under control?

deleted

questionable pictures

We need to keep distinguishing "actual, real-life child-abuse material" from "weird/icky porn". Fediverse services have been used to distribute both, but they represent really different classes of problem.

Real-life CSAM is illegal to possess. If someone posts it on an instance you own, you have a legal problem. It is an actual real-life threat to your freedom and the freedom of your other users.

Weird/icky porn is not typically illegal, but it's something many people don't want to support or be associated with. Instance owners have a right to say "I don't want my instance used to host weird/icky porn." Other instance owners can say "I quite like the porn that you find weird/icky, please post it over here!"

Real-life CSAM is not just extremely weird/icky porn. It is a whole different level of problem, because it is a live threat to anyone who gets it on their computer.
