It is true that removing and demonetising Nazi content wouldn't make the problem of Nazis go away. It would just be moved to dark corners of the internet where the majority of people would never find it, and its presence on dodgy-looking websites combined with its absence on major platforms would contribute to a general sense that being a Nazi isn't something that's accepted in wider society. Even without entirely making the problem go away, the problem is substantially reduced when it isn't normalised.
the weirdest thing to me is these guys always ignore that banning the freaks worked on Reddit--which is stereotypically the most cringe techno-libertarian platform of the lot--without ruining the right to say goofy shit on the platform. they banned a bunch of the reactionary subs and, spoiler, issues with those communities have been much lessened since that happened while still allowing for people to say patently wild, unpopular shit
Yep! Reddit is still pretty awful in many respects (and I only even bother with it for specific communities for which I haven't found a suitable active equivalent on Lemmy - more frogs and bugs on Lemmy please), but it did get notably less unpleasant when the majority of the truly terrible subs were banned. So it does make a difference.
I feel like "don't let perfect be the enemy of good" is apt when it comes to reactionaries and fascists. Completely eliminating hateful ideologies would be perfect, but limiting their reach is still good, and saying "removing their content doesn't make the problem go away" makes it sound like any effort to limit the harm they do is rendered meaningless because the outcome is merely good rather than perfect.
They took way too long unfortunately, but totally agree. thedonald, femaledatingstrategy and fatpeoplehate should have been banned a lot quicker.
It feels like they've let it degrade again too now. Last I was on it, lots of subs had gone really toxic and weird
You're literally on a platform that was created to harbor extremist groups. Look at who Dessalines is (aka u/parentis-shotgun) and their self-proclaimed motivation for writing LemmyNet. When you ban people from a website, they just move to another place; they are not stupid, it's pretty easy to create websites. It's purely optical: you're not saving civilisation from harmful ideas, just preventing yourself from seeing them.
When you ban people from a website, they just move to another place, they are not stupid it's pretty easy to create websites. It's purely optical,
you are literally describing an event that induces the sort of entropy we're talking about here--necessarily when you ban a community of Nazis or something and they have to go somewhere else, not everybody moves to the next place (and those people diffuse back into the general population), which has a deradicalizing effect on them overall because they're not just stewing in a cauldron of other people who reinforce their beliefs
"A deradicalising effect"
I'm sorry what? The idea that smaller communities are somehow less radical is absurd.
I think you are unaware (or much more likely willfully ignoring) that communities are primarily dominated by a few active users, and simply viewed with a varying degree of support by non-engaging users.
If they never valued communities enough to stay with them, then they never really cared about the cause to begin with. These aren't the radicals you need to be concerned about.
"And those people diffuse back into the general population"
Because that doesn't happen to a greater degree when exposed to the "general population" on the same website?
I'm sorry what? The idea that smaller communities are somehow less radical is absurd.
i'd like you to quote where i said this--and i'm just going to ignore everything else you say here until you do, because it's not useful to have a discussion in which you completely misunderstand what i'm saying from the first sentence.
The deradicalizing effect occurs in the people who do not follow the fringe group to a new platform.
Many people lurk on Reddit who will see extremist content there and be influenced by it, but who do not align with the group posting it directly, and will not seek them out after their subreddit or posted content is banned.
Sure but what degree of influence is actually "radicalising" or a point of concern?
We like to pretend that by banning extreme communities we are saving civilisation from them. But the fact is that extreme groups are already rejected by society. If your ideas are not actually somewhat adjacent to already held beliefs, you can't just force people to accept them.
I think a good example of this was the "fall" of Richard Spencer. All the leftist communities (in which I was semi-active at the time) credited his decline to the punch he received, apparently assumed that it was the act of punching that resulted in his decline, and used it to justify more violent actions. The reality is that Spencer just had a clique of friends that the left (and Spencer himself) interpreted as wide support, and when he was punched the greater public didn't care, because they never cared about him.
deradicalizing effect on them overall because they're not just stewing in a cauldron of other people who reinforce their beliefs
Whom are we talking about here, the ones who get kicked out and seek each other in a more concentrated form, or the ones who are left behind without the radicalizing agents?
I don't want to have to deal with Nazis, or several other sects, but I don't think forcing them into a smaller echo chamber is helping either.
Ideally, I think a social platform should lure radicalizing agents, then expose them to de-radicalizing ones, without exposing everyone else. Might be a hard task to achieve, but worth it.
Ideally, I think a social platform should lure radicalizing agents, then expose them to de-radicalizing ones, without exposing everyone else. Might be a hard task to achieve, but worth it.
You really think this works? I don't. I just see them souring the atmosphere for everyone and attracting more mainstream users to their views.
We've seen in Holland how this worked out. The nazi party leader (who chanted "Less Moroccans") won the elections by a landslide a month ago. There is a real danger of disenchanted mainstreamers being attracted to nazi propaganda in droves. We're stuck with them now for 4 years (unless they manage to collapse on their own, which I do hope).
No, that's why I said "Ideally", meaning it as a goal.
I don't think we have the means to do it yet, or at least I don't know of any platform working like that, but I have some ideas of how some of it could be done. Back in the days of Digg, some of us spitballed ideas for social networks, among them a movie-ranking one (which turned out to be a flop because different people would categorize films differently), and a kind of PageRank for social networks, which back then was computationally impractical. But with modern LLMs running trillions of parameters, and further hardware advances, even O(n²) with n in the millions becomes feasible in real time, and in practice it wouldn't need to do nearly that much work. With the right tuning, and dynamic message visibility, I think something like that could create the exact echo chambers that would attract X people, let in de-X people, while keeping everyone else out and unbothered.
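For anyone unfamiliar with the PageRank idea mentioned above, here is a minimal, purely illustrative power-iteration sketch over a toy "who follows whom" graph. The node names, damping factor, and iteration count are all arbitrary choices of mine; an actual social-network ranking would obviously need much more than this:

```python
# Minimal power-iteration PageRank over a tiny directed "follow" graph.
# links: dict mapping each node to the list of nodes it points to.

def pagerank(links, damping=0.85, iterations=50):
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # Every node starts each round with the "random jump" share.
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, outgoing in links.items():
            if outgoing:
                share = rank[node] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += damping * share
            else:
                # Dangling node: spread its rank evenly over everyone.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "c" gets the most inbound weight
```

The naive version is O(iterations × edges); the O(n²) worst case only shows up on dense graphs, which is the impracticality being referred to.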
Of course there is a dark side, in that a platform could use the same strategy to mold the opinion of any group... and I wouldn't be surprised to learn that Meta had been doing exactly that.
I'd argue that it still broke Reddit.
Back in the day, I might say something out of tone in some subreddit, get the comment flagged, discuss it with a mod, and either agree to edit it or get it removed. No problem.
Then Reddit started banning reactionary subs, subs started using bots to ban people for even commenting on other blacklisted subs, subs started abusing automod to ban people left and right, even quoting someone to criticize them started counting as using the same "forbidden words", conversations with mods to clear stuff up pretty much disappeared, and applying the modern ToS retroactively to 10 year old content became a thing... until I got permabanned from the whole site after trying to appeal a ban, with zero human interaction. Some months later, while already banned sitewide, they also banned me from some more subs.
Recently Reddit revealed a "hidden karma" feature to let automod pre-moderate potentially disruptive users.
Issues with the communities may have lessened, but there is definitely no longer the ability to say goofy, wild, or unpopular stuff... or in some cases, even to criticize them. There also have been an unknown number of "collateral damage" bans, that Reddit doesn't care about anymore.
imo if reddit couldn't survive "purging literally its worst elements, which included some of the most vehement bigotry and abhorrent content outside of 4chan" it probably doesn't deserve to survive
I see it as a cautionary tale about relying too much on automated mod tools to deal with an overwhelming userbase. People make mistakes, simple tools make more.
@jarfil@alyaza i have said plenty of wild stuff and haven't been banned from any subs? None of it has been bigoted tho
The only time I got banned for bigoted stuff was precisely for quoting someone's n-word and calling them out on it. Automod didn't care about the context, and no human did either. I also got banned for getting carried away and making a joke in a "no jokes" (zero tolerance) sub; several years of following the rules didn't grant me even a second chance. Then there was the funny time when someone made me a mod of a something-CCP sub, and several other subs automatically banned me.
There is a lot more going on Reddit than what meets the eye, and they like to keep it out of sight.
The only time I got banned for bigoted stuff, was precisely for quoting someone's n-word and calling them out on it. Automod didn't care about the context, no human did either.
It sounds like the right call was made (as long as both you and the OP were banned). As a white person, there is no reason for you to use the n-word. In that situation simply changing it to "n-word" is the very least that could have been done
I'm not really sure how that provides an example of stuff going on in the background that someone wants to keep out of sight.
The thing is I did not "use" it, just quoted their whole message. In hindsight, maybe I should have changed it, but I still find it a flaw to not take context into account.
It provides an example of context-less rules blindly applied by a machine, with no public accountability of what happened, much less of the now gone context.
There are many better ways of handling those cases, like flagging the comment with a content warning, maybe replacing the offensive words, or locking it for moderation, instead of disappearing everything. I didn't have half a chance of fixing things, had to use reveddit to just guess what I might've done wrong.
The thing is, no context would have made it OK. You may have just been quoting someone, but you still used the word in the quote. Quotes are not some uneditable thing, so it was your choice to leave it in. Zero tolerance for hate means repeating the hateful thing is also not tolerated, and that, IMO, is a good thing and the perfect use of an auto-mod.
The other examples are a bit nebulous, and I have no doubt that communities on reddit have esoteric moderation guidelines, but this particular example seems pretty cut and dry.
Quotes are not uneditable... but neither are comments.
Wouldn't be the first time the parent got edited to make a reply look like nonsense, so I got used to quoting as a countermeasure. Then they unlocked comment editing even in 10 year old "archived" posts (BTW, the same applies to Lemmy: should I quote you? will you edit what you said?... tomorrow, or in 10 years?... maybe I'll risk it, this time)
"Zero tolerance" becomes a problem when the system requires you to quote, but then some months or years later decides to change the rules and applies them retroactively. I still wouldn't mind if they just flagged, hid, or removed the comment, it's the "go on a treasure hunt to find out why you got banned" that I find insulting (kind of like the "wrong login"... /jk, you got banned. Wonder if it's been fixed in Lemmy already, I know of some sites that haven't for the last 15 years).
Quotes are not uneditable… but neither are comments.
You kinda get into an ouroboros of who has fewer edits, and honestly I don't know how to solve for that, but I do know that if you had substituted "n-word" for the slur, it would look exactly the same if the OP edited the comment after the fact. Quoting the slur doesn't mitigate that.
"Zero tolerance" becomes a problem when the system requires you to quote, but then some months or years later decides to change the rules and applies them retroactively.
Any policy becomes a problem at that point. It becomes less of a policy and more of a guideline
I really struggle to take seriously what these tech people say about "not wanting to censor". They made a business calculation, and maybe an ideological one, and decided "we want that nazi money, it's worth it to us." Which really tells you everything about a company and how it is likely to approach other issues, too.
It's also disingenuous because they already decline to host sex workers' newsletters. So if the censorship angle were true, they're already censoring.
RIGHT. Thank you for pointing this out.
Right, and if the profit motive angle were true, they're already violating that by censoring the sex workers you just mentioned.
So that eliminates profit as the reason for their actions here
Yes! I love this simplification!
I feel like they always try to make it sound more complicated and high-minded. I really don't believe it is!
What do you call a company that puts profits above all? A company.
Last time I asked for advice about registering a nonprofit, I was told "but you don't yet have enough profits to use a nonprofit for tax evasion"
I'm not sure I understand your point here. Everyone from a sole proprietor to a mega corporation is in it for profit. Just because the upper one percent is dodgy as hell and plays fast and loose with the tax code doesn't mean every single company in existence is terrible or out to do sketchy business. I'm pretty happy with mine. I wouldn't be there if I wasn't working with honest people.
My point is that it's nothing to be surprised about when a company makes a decision to increase its profit.
As for the rest: getting a profit from your work is called "a job". Companies are created to get a profit in excess of whatever job the owners are doing; otherwise it's called a "non-profit", meaning no profit for the owners beyond their job at the company, which they still get paid for.
I don't know the company you're working for, but if it has any profits that don't revert to the people doing the job, or the amortization of the initial investment, then the owners are "skimming off the top" from everyone.
The people I asked for advice, from the corporate world, were so entrenched in that same "profit first" mentality, that they couldn't even grasp the idea of only getting paid for your actual work, and only saw non-profits as a tool for tax evasion.
Thank you for clarifying your position. I'm not in full agreement but I respect the points you brought up here. Cheers
Not gonna lie. I've never heard of Substack but I appreciate their stance of publicly announcing why I would continue to avoid them.
My only interaction with Substack is that one podcast moved there for premium content. I thought it was mostly for written newsletters, and I always wondered how much of a market there actually is for paying for a single newsletter, but then again it's just the written version of podcasts, so I guess there is one. Promoting Nazi content gives me a lot of pause, though.
techno-libertarianism strikes again! it's every few years with these guys where they have to learn the same lesson over again that letting the worst scum in politics make use of your website will just ensure all the cool people evaporate off your website--and Substack really does not have that many cool people or that good of a reputation to begin with.
They just really, really love running Nazi bars. They just don't like it when the normies realizes that the neighbourhood bar is a Nazi bar.
i go back and forth on how much i think this tendency's willingness to host content like this and/or go to the mat for it is agreement and how much of it is just stupidity or ill-conceived ideology. a lot of these guys seem like they agree with elements of fascism, but a lot of them are also... just not smart.
The Nazi bar analogy says nothing about agreement. Just that failing to remove Nazis from your bar is a great way to flood your bar with Nazis, because once they know it's safe for them to Nazi it up in your establishment, they'll tell their friends about you.
If you don't proactively remove the Nazis, you're creating a Nazi safe space, whether you agree with them or not.
i am familiar with the analogy, but i think it would obviously be worse if they agree with what they're platforming instead of just being kind of half-baked morons who don't have good political positions or cynically platforming it because it makes them money. one can, in effect, be remedied by showing them social or financial retribution, but the other would be a manifestation of a much more serious social problem that cannot be immediately dealt with
I think you have to look at the money here. The most charitable view for substack is their payment provider doesn't ban Nazis, and their VC funders don't want them to ban Nazis, and so they don't really have a choice.
I think substack is well up for being a nazi bar based on what they've said so I'm happy to give them some blame but I won't be letting the other two off the hook either.
Joyce Carol Oates is there; she counts for hundreds of cool people. I think some other writers make use of it too. I hope they voice their discontent.
Nazis find a way to ruin every fucking thing. I really believe certain groups of people should not have the right to free speech. In 2024, we should be well aware that tolerating intolerance does not work. Just fucking look around and take a look at what these people are doing with their free speech. I am not the gatekeeper of good morals or the bastion of good values. Some ideologies are objectively bad, though.
Translation: "We support Nazis and would like to offer them passive protection. If you have a problem with them, we will ban you"
Any writers still on Substack need to immediately look at alternative options and shift their audiences to other platforms. To stick around on the site when the founder straight up condones neo-Nazis, and not only gives them a platform but profit-shares with them and their nazi subscribers, is insane.
If they plan to do business in the EU this is illegal.
Well, you can create an account from EU, although mine got locked after creating just one blog post. And the support does not seem to respond, so I moved to a different platform.
Reading about this at work the other day, I announced to my coworkers that Substack is officially bad. Profiting off of nazi propaganda is bad. Fuck Substack.
I had recently subscribed to the RSS feed for The Friendly Atheist and was considering monetary support. They accept via Substack or Patreon. I would have opted for Patreon anyway, because that's where I already have subscriptions. But after learning about this, I'll never support anything, no matter what, via Substack. Eat my ass, shitheads.
What do you mean, banning doesn't work? The less reach those Nazis have, the fewer people see their Nazi posts and get turned into Nazis. It also needs to be clear that being a Nazi is not acceptable, so they don't have the courage to spread their hate. This bullshit needs to stop.
Nope, never supporting anything from Substack again.
"Freeze peach" libertarians can go to hell.
if you say nazi and white supremacist content is just a "different point of view", you support nazi and white supremacist content. period.
and it's not surprising, given Lulu Meservey's post on Twitter during the whole situation with Elon basically abandoning moderation:
"Substack is hiring! If you're a Twitter employee who's considering resigning because you're worried about Elon Musk pushing for less regulated speech… please do not come work here."
The problem is that some people are quick to call things Nazi and white supremacist, when it's actually just something they disagree with.
That's not the problem at all. If you support fascists, then you support Nazis and white supremacy.
There are a lot of empirical claims surrounding this topic, and I'm unaware who really has good evidence for them. The Substack guy e.g. is claiming that banning or demonetising would not "solve the problem" -- how do we really know? At the very least, you'd think that demonetising helps to some extent, because if it's not profitable to spread certain racist ideas, that's simply less of an incentive. On the other hand, plenty of people on this thread are suggesting it does help address the problem, pointing to Reddit and other cases -- but I don't think anyone really has a grip on the empirical relationship between banning/demonetising, shifting ideologues to darker corners of the internet, and what impact their ideas ultimately have. And you'd think the relationship wouldn't be straightforward either -- there might be some general patterns, but it could vary according to so many contingent and contextual factors.
I agree it's murky. Though I'd like to note that when you shift hateful ideologues to dark corners of the internet, that also means making space in the main forums for people who would otherwise be forced out by the aforementioned ideologues - women, trans folks, BIPOC folks, anyone who would like to discuss xyz topic but not at the cost of the distress that results from sharing a space with hateful actors.
When the worst of the internet is given free rein to run rampant, it has a tendency to take over the space entirely with hate speech, because everything and everyone else leaves instead of putting up with abuse, and those who do stay get stuck having the same rock-bottom conversations (e.g. those in which the targets of the hate are repeatedly asked to justify their existence or presence or right to have opinions) over and over with people who aren't really interested in intellectual discussions or solving actual problems or making art that isn't about hatred.
But yeah, as with anything involving large groups of people, these things get complicated and can be unpredictable.
Thank you! Even on Lemmy I find the atmosphere often oblivious or ignorant of marginalized views. The majority here are cis men (according to the poll earlier this year) and it certainly shows. And the people here are probably mostly left-leaning? So I definitely couldn't imagine sharing a space with anyone more right-leaning than that.
It depends a lot on the instance IMO. I didn't like the attitude at lemmy.ml but I like it here at beehaw. I'm very left-leaning, progressive and LGBTQ+ friendly.
Lemmy.ml and lemmy.world are more right-leaning as far as I can see.
Yes sure, beehaw is more progressive. But still I sometimes don't feel so comfortable in its community because it can at times feel very male-centered.
The problem when you own a space is that if you let certain groups of people in, such as, in this example, Nazis, you'll literally drive everyone else away, so that what started off as a normal, ordinary space becomes, essentially, a Nazi bar.
It's not only Nazis -- it can be fascists, white supremacists, meth-heads, PUAs, cryptocurrency fanboys. Some groups will be so odious to others that they will drive everyone else from your space, so the only solution is to ensure that they don't come to your place at all, even if they're nice and polite and "follow your rules" -- because even if they do, their friends won't, and those friends have a history of driving away other people from other spaces.
"you have to nip it in the bud immediately. These guys come in and it's always a nice, polite one. And you serve them because you don't want to cause a scene. And then they become a regular and after awhile they bring a friend. And that dude is cool too.
And then THEY bring friends and the friends bring friends and they stop being cool and then you realize, oh shit, this is a Nazi bar now. And it's too late because they're entrenched and if you try to kick them out, they cause a PROBLEM. So you have to shut them down."
What evidence did you find to support Substackâs claims? They didnât share any.
You can quickly and easily find good evidence for things like Reddit quarantining and the banning of folks like Alex Jones and Milo Yiannopoulos.
Which claims are empirical again?
There's a lot of empirical claims surrounding this topic, and I'm unaware who really has good evidence for them. The Substack guy e.g. is claiming that banning or demonetising would not "solve the problem" -- how do we really know?
Well it depends what you define as "the problem".
If you define it as Nazis existing per se, banning them does not "solve the problem" of nazis existing. They will just go elsewhere. A whole world war was not enough to get rid of them.
However, allowing them on mainstream platforms does make their views more prevalent to mainstream users and some might fall for their propaganda similar to the way people get attracted to the Qanon nonsense. So if you define the problem as "Nazis gaining attention" then yeah sure. It certainly does "solve the problem" to some degree. And I think this is the main problem these days (even in the Netherlands which is a fairly down to earth country, the fascists gained 24% of the votes in the last election!)
However, however you define "the problem", making money off nazi propaganda is just simply very very bad form. And it will lead to many mainstream users bugging out, and rightly so.
we also do know that going after nazis and white supremacists works, since all through the 90s they were relegated to the fringe of the fringe corners of the internet.
I always hate policy talk trying to split the hairs of Nazism and "calls for violence".
Even worse, I just can't understand allowing monetization. If you truly "hate the views", stop lining your pockets with their money…
The only thing they hate is not taking their money.
There are too many of these goddamned social networks anyway. After Twitter/X exploded, everyone else wanted to grab a piece of that pie, and now we've got a dozen social networks nobody uses.
If you want a progressive social network that doesn't take shit from goosesteppers, Cohost is probably the place to go. It's so neurodivergent and trans-friendly that I can't imagine them blithely accepting Nazi content. It's just not how Cohost works. "Blah blah blah, free speech!" Not here, chumps. We've got standards. Go somewhere else to push that poison.
Substack started so well... It was looking like the new Medium (after Medium totally enshittified). But the discovery was never very good there, and now this. Nope. Not going to blog there.
I wonder if Snowden still supports them.
probably since snowden thinks putin is amazing
Does he really? I think it's more like he got stuck there on his way to Ecuador and now he has no alternative but to "like" Putin :P
After all his plan was never to stay in Russia.
i mean: "Edward Snowden gets Russian passport after swearing oath of allegiance. Whistleblower is 'happy and thankful to the Russian Federation' for his citizenship, lawyer says"
I know.. But the point is, he's stuck there.
He had 2 choices:
Play ball and swear his oath and suck up a little to the godfather and live happily ever after
Kick up a stink against Putin and find himself falling out of a closed window (or, best case, being deported and spending his days in a max security prison).
It's not really like "free will" applies here :)
Whatever the lawyer said is just the minimum required decorum IMO. Just politics. The oath is probably simply required to get the passport.
Putin got to get one-over on the US and Snowden got to stay out of prison (well, in reality a really huge prison but still...). It's a marriage of convenience.
Does nobody remember the exact same dust-up the day after Substack launched?
So they're complicit in it, plain and simple. If you are complicit with Nazis, to me, you're a Nazi. I don't give a shit. What's that saying the Germans have? If there are six guys at a table in a bar and one of them is a Nazi, then there are six Nazis at the table? Yeah, that.
As always, there are several different aspects:
Promoting [Nazi propaganda and misinformation]
Laughing at [...]
Analyzing [...]
Profiting from [...]
Sometimes the difference between "promotion", "laughing at", and "analysis" depends more on the reader's approach than on the writer's intent.
Then again, sometimes a reader decides they don't want to deal with any of it, which is also respectable.
Look. If there are 9 people at a table sitting with 1 Nazi and hanging out, there are 10 Nazis.
That's the quick, easy, and wrong approach.
What are 9 non-Nazi people supposed to do: kick the 1 Nazi to a Nazi-only table? Leave the table and now have 2 Nazi-only tables? Get everyone thrown out?
Nazism works like any other sect: what converted people need is exposure to other ways of thinking, ideally some human connections with people whom the sect demonizes and tries to keep members away from. Pushing sect members away from society is precisely what the sect wants!
I'm not saying that you personally, or even me, should be the ones to do that, or that we should idly watch, or not have a Nazi-free table.
What I'm saying is that non-Nazis putting up with a Nazi in order to de-program them, should be praised, and that you can't tell what's really going on just by watching who sits where.
This is too complex and nuanced. You punch the nazi and throw them out of the establishment. That's it. That's all.
That's how you reinforce the indoctrination of "us vs. them", and how they're the oppressed ones. Far from "it" or "all".
Yes. I don't care if they feel oppressed. Tolerance does not extend to Nazis. They SHOULD feel oppressed. I will not entertain being soft on Nazis. I will not entertain "seeing both sides". I will not entertain Nazis.
I don't think you understand. Instilling a feeling of oppression, then a desire to "fight against the oppression", is how Nazis get created. Nobody's asking you to tolerate it, or see any "both sides", there are no sides. The only side is whether you work with or against their indoctrination.
All I'm saying is you better double check those 9 other people at that table, in case they're doing the hard work you or me don't want to.
I think I may not have made my position clear. If there is 1 Nazi at the table, and 9 people are just sitting with them hanging out... you have 10 Nazis. They are not "doing the hard work", otherwise the 1 Nazi would not be sitting there. I'm hearing a lot of sympathizing towards Nazis in this thread.
You have made your position very clear, but if what you got from what I'm saying is "sympathizing towards Nazis", then you've missed my point completely, and I don't know how to make it any clearer.
I just want to state very emphatically that deradicalizing people is a specific skill set and set of actions that is completely different than âbeing friendly to nazisâ. And tolerating bigotry so that people donât feel bad about their bigotry is just tolerating bigotry.
On that note, on another post you argued heavily with multiple users that white privilege is not real and that you were being oppressed for your whiteness. I thought maybe you were very young, or confused, and tried to have empathy and explain some concepts, but here you are now also arguing that we need to be nice to nazis for the good of society so that they don't feel oppressed. I suppose you might say that pointing this out and making you feel more oppressed would drive you further away, but a better approach, I think, would be to tell you very very directly that the things you have been saying here, in multiple places, are white supremacist talking points. And no one here is going to condone that. Stop. If you need help stopping, that is your responsibility, not the hypothetical 9 other people's.
How can you tell "deradicalizing" from "being friendly" by just seeing people sit at the same table? One of the strategies for deradicalization is precisely having positive experiences with people the group/sect has vilified.
here you are now also arguing that we need to be nice to nazis
No. In a nazi-free area, just kick the nazi, that's easy. What I'm arguing is that you should give the benefit of the doubt to the other 9 people. Don't assume.
on another post you argued heavily with multiple users that white privilege is not real
Let me clear that up:
White privilege is real. So are others.
I've always tried to avoid using my privileges.
People assuming I'm too privileged have fucked me over.
These are not either-or.
I thought maybe you were very young, or confused, and tried to have empathy and explain some concepts
Not very young, maybe confused, maybe living in a different society, most likely with different life experiences.
Thank you for trying to explain some concepts. I learned some stuff in that other post, and I'm grateful for that (even if the conclusion was depressing).
the things you have been saying here, in multiple places, are white supremacist talking points
If you need help stopping, that is your responsibility
Is this another "learn the book of forbidden words" situation?
I refused to read into white supremacist propaganda any further than seeing their basic manipulation strategies, and that was a few decades ago. Are you asking me to read the updated version?
Over the past several days I've seen you draw out many good-faith disagreements about racism or nazism into what seem like intentionally blurry "just asking questions" type derailments, whereby you try to shift the topic of the discussion to other, emotional or tangential details and/or try to misrepresent the issue at hand to make the racism or nazism seem not that bad. I really don't think someone would do that if they were coming from a place of genuine confusion or curiosity or dialogue. I might be wrong, but taken together it really gives the impression, intent aside, that you're trying to spin up plausible arguments for far-right stuff and then sow confusion whenever people say "hey, don't do that, it's harmful". I just don't believe there's wiggle room here. I don't want to have a circular conversation about it, but I do want to point out directly what you're doing, because I think it sucks, and I think that you should stop.
Ok, let's be direct: I'm against nazism, racism, sexism, pretty much the concept of -isms itself, and a techno-anarcho-communist at heart. I generally try to avoid putting too much of my own bias into things, and I do have a tendency to focus on exploring a single aspect of an argument (you could call it "tangential details")... but if you see me use anything resembling "far right arguments", then it means I've gone too far and I will be grateful if you, or anyone else, call me out on it.
Is that acceptable?
you try to shift the topic of the discussion to other, emotional or tangential details
I think I've only pulled one emotional tangent, just because it's impacting me personally right now. It's hard to be objective about that. But I found the ensuing discussion educational, so... I'm serious: thanks everyone for answering.
PS: merry holidays.
Over the past several days I've seen you draw out many good-faith disagreements about racism or nazism into what seem like intentionally blurry "just asking questions" type derailments, whereby you try to shift the topic of the discussion to other, emotional or tangential details and/or try to misrepresent the issue at hand to make the racism or nazism seem not that bad.
This does not appear to me at all what is happening, at least in this thread, and I would even go as far as to call it gaslighting.
The other user literally said if 10 people are at a table and 1 is a Nazi, then all 10 are Nazis. They have also labelled any opposing view as "sympathizing towards Nazis" in another comment. That is pretty damn fucking far from good faith. And yet, somehow, because this other user pointed out the problem with this type of thinking, you are now accusing them of not being good faith? Are you serious? People are refusing to have any kind of nuanced view of the situation, accusing everyone in that situation of being a Nazi and people who disagree of being sympathizers, but somehow the other person is the one not acting in good faith, or using emotional arguments?
I really don't want to be rude, but your comment reads like textbook projection. They also never said anything to defend Nazis or the far right, not once (*), so that makes you the one who is misrepresenting what they are saying and doing. I encourage you to keep everything you said in mind, but re-read the thread through a more objective lens.
I really didn't want to get involved in this conversation, but some of these comments really frustrated me, and yours was just the straw that broke the camel's back; I had to let some of the frustration out. If you just want to ignore me, that's fine.
(*) At least as far as this discussion is concerned; I do not have an all seeing eye.
The other user is right. Any tolerance for those ideologies gives them a foothold. It makes room for them. There can be no room, at all, for that shit. Ever. Arguing otherwise is dangerous. There is no nuance here. At all.
No. In a nazi-free area, just kick the nazi, that's easy. What I'm arguing is that you should give the benefit of the doubt to the other 9 people. Don't assume.
All areas should be nazi-free areas. If any of them are truly attempting to de-radicalize someone they would know that there is a time and place for it, and out in general society is not it.
Is this another "learn the book of forbidden words" situation?
This is very reductive, dismissive and exactly the kind of thing a white supremacist would say to try and justify saying something shitty. Which is exactly PotentiallyAnApricot's point. I could chalk it up to naivety, but just as an outside observer on this thread and others, I don't think it is.
I refused to read into white supremacist propaganda any further than seeing their basic manipulation strategies, and that was a few decades ago. Are you asking me to read the updated version?
Dismantling systemic oppression personally, interpersonally, and in greater society is a constant process. Especially as a white person. Saying you did some research decades ago and are all good now is not how it works.
In a nazi-free area, just kick the nazi
All areas should be nazi-free areas.
Now, this is reductive and dismissive. "Tow it outside the environment" is not an option; you can only create reservation, concentration, and general areas.
If any of them are truly attempting to de-radicalize someone they would know that there is a time and place for it, and out in general society is not it.
Part of de-radicalization is reinsertion into general society. Can't be done outside. Studies have shown that de-radicalization actually lags way behind reinsertion. What you're proposing are lifelong reeducation camps.
is a constant process. Especially as a white person.
The what? Not sure if you realize, but stuff like this "Especially as a [whatever]" is what pushes people over the line.
Dismantling systemic oppression personally, interpersonally [...]
Saying you did some research decades ago and are all good now is not how it works.
This is not what I said.
Now, this is reductive and dismissive. "Tow it outside the environment" is not an option
It is very much an option, and one that works. You can't have an inclusive society if you accept members that want other members dead.
Part of de-radicalization is reinsertion into general society.
After they are no longer a nazi. If they are still a nazi, they belong outside society.
stuff like this "Especially as a [whatever]" is what pushes people over the line.
Ok, to be less specific: especially as the people who benefit from systemic oppression, we have to keep more on it than those who do not benefit from it. I used white people because we are currently the largest beneficiaries of systemic oppression. If that "pushes you over the line", you were too close to the line to begin with.
This is not what I said.
The meaning most people get from what you said is that you learned how white supremacists manipulate people decades ago, then disengaged. Again, in order to keep getting better, you have to keep learning, and for fucking sure chuds will keep trying to distract and minimize.
i never used substack before and they are doing a good job making sure i never do. i hope they like being the nazi bar.
🤖 I'm a bot that provides automatic summaries for articles:
::: spoiler Click here to see the summary
While McKenzie offers no evidence to back these ideas, this tracks with the company's previous stance on taking a hands-off approach to moderation.
In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions.
"We're not going to get into specific 'would you or won't you' content moderation questions" over the issue of overt racism being published on the platform, Best said.
In a 2020 letter from Substack leaders, including Best and McKenzie, the company wrote, "We just disagree with those who would seek to tightly constrain the bounds of acceptable discourse."
The Atlantic also pointed out an episode of McKenzie's podcast with a guest, Richard Hanania, who has published racist views under a pseudonym.
McKenzie does, however, cite another Substack author who describes its approach to extremism as one that is "working the best." What it's being compared to, or by what measure, is left up to the reader's interpretation.
:::
It is true that removing and demonetising Nazi content wouldn't make the problem of Nazis go away. It would just be moved to dark corners of the internet where the majority of people would never find it, and its presence on dodgy-looking websites combined with its absence on major platforms would contribute to a general sense that being a Nazi isn't something that's accepted in wider society. Even without entirely making the problem go away, the problem is substantially reduced when it isn't normalised.
the weirdest thing to me is these guys always ignore that banning the freaks worked on Reddit--which is stereotypically the most cringe techno-libertarian platform of the lot--without ruining the right to say goofy shit on the platform. they banned a bunch of the reactionary subs and, spoiler, issues with those communities have been much lessened since that happened while still allowing for people to say patently wild, unpopular shit
Yep! Reddit is still pretty awful in many respects (and I only even bother with it for specific communities for which I haven't found a suitable active equivalent on Lemmy - more frogs and bugs on Lemmy please), but it did get notably less unpleasant when the majority of the truly terrible subs were banned. So it does make a difference.
I feel like "don't let perfect be the enemy of good" is apt when it comes to reactionaries and fascists. Completely eliminating hateful ideologies would be perfect, but limiting their reach is still good, and saying "removing their content doesn't make the problem go away" makes it sound like any effort to limit the harm they do is rendered meaningless because the outcome is merely good rather than perfect.
They took way too long unfortunately, but totally agree. thedonald, femaledatingstrategy and fatpeoplehate should have been banned a lot quicker.
It feels like they've let it degrade again too now. Last I was on it, lots of subs had gone really toxic and weird
You're literally on a platform that was created to harbor extremist groups. Look at who Dessalines is (aka u/parentis-shotgun) and their self-proclaimed motivation for writing LemmyNet. When you ban people from a website, they just move to another place; they are not stupid, it's pretty easy to create websites. It's purely optical: you're not saving civilisation from harmful ideas, just preventing yourself from seeing them.
you are literally describing an event that induces the sort of entropy we're talking about here--necessarily when you ban a community of Nazis or something and they have to go somewhere else, not everybody moves to the next place (and those people diffuse back into the general population), which has a deradicalizing effect on them overall because they're not just stewing in a cauldron of other people who reinforce their beliefs
"A deradicalising effect"
I'm sorry what? The idea that smaller communities are somehow less radical is absurd.
I think you are unaware (or much more likely willfully ignoring) that communities are primarily dominated by a few active users, and simply viewed with a varying degree of support by non-engaging users.
If they never valued communities enough to stay with them, then they never really cared about the cause to begin with. These aren't the radicals you need to be concerned about.
"And those people diffuse back into the general population"
Because that doesn't happen to a greater degree when exposed to the "general population" on the same website?
i'd like you to quote where i said this--and i'm just going to ignore everything else you say here until you do, because it's not useful to have a discussion in which you completely misunderstand what i'm saying from the first sentence.
The deradicalizing effect occurs in the people who do not follow the fringe group to a new platform.
Many people lurk on Reddit who will see extremist content there and be influenced by it, but who do not align with the group posting it directly, and will not seek them out after their subreddit or posted content is banned.
Sure but what degree of influence is actually "radicalising" or a point of concern?
We like to pretend that by banning extreme communities we are saving civilisation from them. But the fact is that extreme groups are already rejected by society. If your ideas are not actually somewhat adjacent to already held beliefs, you can't just force people to accept them.
I think a good example of this was the "fall" of Richard Spencer. All the leftist communities (in which I was semi-active at the time) credited his decline to the punch he received, apparently assumed that it was the act of punching that resulted in his decline, and used it to justify more violent actions. The reality is that Spencer just had a clique of friends that the left (and Spencer himself) interpreted as wide support, and when he was punched the greater public didn't care, because they never cared about him.
Whom are we talking about here, the ones who get kicked out and seek each other in a more concentrated form, or the ones who are left behind without the radicalizing agents?
I don't want to have to deal with Nazis, or several other sects, but I don't think forcing them into a smaller echo chamber is helping either.
Ideally, I think a social platform should lure radicalizing agents, then expose them to de-radicalizing ones, without exposing everyone else. Might be a hard task to achieve, but worth it.
You really think this works? I don't. I just see them souring the atmosphere for everyone and attracting more mainstream users to their views.
We've seen in Holland how this worked out. The nazi party leader (who chanted "Less Moroccans") won the elections by a landslide a month ago. There is a real danger of disenchanted mainstreamers being attracted to nazi propaganda in droves. We're stuck with them now for 4 years (unless they manage to collapse on their own, which I do hope).
No, that's why I said "Ideally", meaning it as a goal.
I don't think we have the means to do it yet, or at least I don't know of any platform working like that, but I have some ideas of how some of it could be done. Back in the days of Digg, with some people, we spitballed some ideas for social networks, among them a movie-ranking one (which turned out to be a flop because different people would categorize films differently), and a kind of PageRank for social networks, which back then was computationally impractical. But with modern LLMs running trillions of parameters, and further hardware advances, even O(n²) with n in the millions becomes feasible in real time, and in practice it wouldn't need to do nearly that much work. With the right tuning, and dynamic message visibility, I think something like that could create the exact echo chambers that would attract X people, allow in de-X people, while keeping everyone else out and unbothered.
Of course there is a dark side, in that a platform could use the same strategy to mold the opinion of any group... and I wouldn't be surprised to learn that Meta had been doing exactly that.
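The "kind of PageRank for social networks" mentioned above can be sketched with plain power iteration. This is only a toy illustration of the general idea; the graph, user names, and parameters are all made up, not any real platform's data or algorithm:

```python
# Toy "PageRank for social networks": rank users by how much interaction
# flows toward them. All data here is hypothetical.

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each user to the list of users they interact with."""
    users = list(links)
    n = len(users)
    rank = {u: 1.0 / n for u in users}
    for _ in range(iters):
        # Everyone keeps a small baseline rank (the "teleport" term).
        new = {u: (1.0 - damping) / n for u in users}
        for u, outs in links.items():
            if not outs:
                # Dangling user with no interactions: spread their rank evenly.
                for v in users:
                    new[v] += damping * rank[u] / n
            else:
                # Split this user's rank among the people they interact with.
                for v in outs:
                    new[v] += damping * rank[u] / len(outs)
        rank = new
    return rank

# "a" receives interactions from both "b" and "c", so it ranks highest.
graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
scores = pagerank(graph)
print(sorted(scores, key=scores.get, reverse=True))  # prints ['a', 'b', 'c']
```

A dense interaction graph makes each iteration O(n²), which is the cost the comment refers to; real interaction graphs are sparse, which is why "in practice it wouldn't need to do nearly that much work".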
I'd argue that it still broke Reddit.
Back in the day, I might say something out of tone in some subreddit, get the comment flagged, discuss it with a mod, and either agree to edit it or get it removed. No problem.
Then Reddit started banning reactionary subs, subs started using bots to ban people for even commenting on other blacklisted subs, subs started abusing automod to ban people left and right, even quoting someone to criticize them started counting as using the same "forbidden words", conversations with mods to clear stuff up pretty much disappeared, application of modern ToS retroactively to 10-year-old content became a thing... until I got permabanned from the whole site after trying to appeal a ban, with zero human interaction. Some months later, while already banned sitewide, they also banned me from some more subs.
Recently Reddit revealed a "hidden karma" feature to let automod pre-moderate potentially disruptive users.
Issues with the communities may have lessened, but there is definitely no longer the ability to say goofy, wild, or unpopular stuff... or in some cases, even to criticize them. There have also been an unknown number of "collateral damage" bans, which Reddit doesn't care about anymore.
imo if reddit couldn't survive "purging literally its worst elements, which included some of the most vehement bigotry and abhorrent content outside of 4chan" it probably doesn't deserve to survive
I see it as a cautionary tale about relying too much on automated mod tools to deal with an overwhelming userbase. People make mistakes, simple tools make more.
@jarfil @alyaza i have said plenty of wild stuff and haven't been banned from any subs? None of it has been bigoted tho
The only time I got banned for bigoted stuff was precisely for quoting someone's n-word and calling them out on it. Automod didn't care about the context, and no human did either. I also got banned for getting carried away and making a joke in a "no jokes" (zero tolerance) sub; several years of following the rules didn't grant me even a second chance. Then there was the funny time when someone made me a mod of a something-CCP sub, and several other subs automatically banned me.
There is a lot more going on at Reddit than meets the eye, and they like to keep it out of sight.
It sounds like the right call was made (as long as both you and the OP were banned). As a white person, there is no reason for you to use the n-word. In that situation simply changing it to "n-word" is the very least that could have been done
I'm not really sure how that provides an example of stuff going on in the background that someone wants to keep out of sight.
The thing is I did not "use" it, just quoted their whole message. In hindsight, maybe I should have changed it, but I still find it a flaw to not take context into account.
It provides an example of context-less rules blindly applied by a machine, with no public accountability of what happened, much less of the now gone context.
There are many better ways of handling those cases, like flagging the comment with a content warning, maybe replacing the offensive words, or locking it for moderation, instead of disappearing everything. I didn't have half a chance of fixing things, had to use reveddit to just guess what I might've done wrong.
The thing is, no context would have made it OK. You may have just been quoting someone, but you still used the word in the quote. Quotes are not some uneditable thing, so it was your choice to leave it in. Zero tolerance for hate means repeating the hateful thing is also not tolerated, and that, IMO, is a good thing and the perfect use of an auto-mod.
The other examples are a bit nebulous, and I have no doubt that communities on reddit have esoteric moderation guidelines, but this particular example seems pretty cut and dry.
Quotes are not uneditable... but neither are comments.
Wouldn't be the first time the parent gets edited to make a reply look like nonsense, so I got used to quoting as a countermeasure. Then they unlocked comment editing even in 10-year-old "archived" posts 🤦 (BTW, the same applies to Lemmy: should I quote you? will you edit what you said?... tomorrow, or in 10 years?... maybe I'll risk it, this time)
"Zero tolerance" becomes a problem when the system requires you to quote, but then some months or years later decides to change the rules and applies them retroactively. I still wouldn't mind if they just flagged, hid, or removed the comment, it's the "go on a treasure hunt to find out why you got banned" that I find insulting (kind of like the "wrong login"... /jk, you got banned. Wonder if it's been fixed in Lemmy already, I know of some sites that haven't for the last 15 years).
You kinda get into an ouroboros of who has fewer edits, and honestly I don't know how to solve that, but I do know that if you had substituted "n-word" for the slur, it would look exactly the same if the OP edited the comment after the fact. Quoting the slur doesn't mitigate that.
Any policy becomes a problem at that point. It becomes less of a policy and more of a guideline
I really struggle to take seriously what these tech people say about "not wanting to censor". They made a business calculation, and maybe an ideological one, and decided "we want that nazi money, it's worth it to us." Which really tells you everything about a company and how it is likely to approach other issues, too.
It's also disingenuous because they already decline to host sex workers' newsletters. So if the censorship angle were true, they're already censoring.
RIGHT. Thank you for pointing this out.
Right, and if the profit motive angle were true, they're already violating that by censoring the sex workers you just mentioned.
So that eliminates profit as the reason for their actions here
Yes! I love this simplification!
I feel like they always try to make it sound more complicated and high-minded. I really don't believe it is!
What do you call a company that puts profits above all? A company.
Last time I asked for advice about registering a nonprofit, I was told "but you don't yet have enough profits to use a nonprofit for tax evasion".
I'm not sure I understand your point here. Everyone from a sole proprietor to a mega corporation is in it for profit. Just because the upper one percent is dodgy as hell and plays fast and loose with the tax code doesn't mean every single company in existence is terrible or out to do sketchy business. I'm pretty happy with mine. I wouldn't be there if I wasn't working with honest people.
My point is that it's nothing to be surprised about when a company makes a decision to increase its profit.
As for the rest: getting a profit from your work is called "a job". Companies are created to get a profit in excess of whatever job the owners are doing; otherwise it's called a "non-profit", where the owners get no profit beyond their actual job at the company, which they still get paid for.
I don't know the company you're working for, but if it has any profits that don't revert to the people doing the job, or the amortization of the initial investment, then the owners are "skimming off the top" from everyone.
The people I asked for advice, from the corporate world, were so entrenched in that same "profit first" mentality, that they couldn't even grasp the idea of only getting paid for your actual work, and only saw non-profits as a tool for tax evasion.
Thank you for clarifying your position. I'm not in full agreement, but I respect the points you brought up here. Cheers 🍻
Not gonna lie. I've never heard of Substack but I appreciate their stance of publicly announcing why I would continue to avoid them.
My only interaction with Substack is that one podcast moved there for premium content. I thought it was mostly for written newsletters, and I always wondered how much of a market there actually is for paying for a single newsletter; but then again, I guess it's just the written version of podcasts, so I suppose there is a market. Though promoting Nazi content gives me a lot of pause.
techno-libertarianism strikes again! it's every few years with these guys where they have to learn the same lesson over again that letting the worst scum in politics make use of your website will just ensure all the cool people evaporate off your website--and Substack really does not have that many cool people or that good of a reputation to begin with.
They just really, really love running Nazi bars. They just don't like it when the normies realize that the neighbourhood bar is a Nazi bar.
i go back and forth on how much i think this tendency's willingness to host content like this and/or go to the mat for it is agreement and how much of it is just stupidity or ill-conceived ideology. a lot of these guys seem like they agree with elements of fascism, but a lot of them are also... just not smart.
The Nazi bar analogy says nothing about agreement. Just that failing to remove Nazis from your bar is a great way to flood your bar with Nazis, because once they know it's safe for them to Nazi it up in your establishment, they'll tell their friends about you.
If you don't proactively remove the Nazis, you're creating a Nazi safe space, whether you agree with them or not.
i am familiar with the analogy, but i think it would obviously be worse if they agree with what they're platforming, instead of just being kind of half-baked morons who don't have good political positions, or cynically platforming it because it makes them money. being a moron or a cynic can, in effect, be remedied by showing them social or financial retribution, but genuine agreement would be a manifestation of a much more serious social problem that cannot be immediately dealt with.
I think you have to look at the money here. The most charitable view for substack is their payment provider doesn't ban Nazis, and their VC funders don't want them to ban Nazis, and so they don't really have a choice.
I think substack is well up for being a nazi bar based on what they've said so I'm happy to give them some blame but I won't be letting the other two off the hook either.
Joyce Carol Oates is there; she counts for hundreds of cool people. I think some other writers make use of it too. I hope they voice their discontent.
Nazis find a way to ruin every fucking thing. I really believe certain groups of people should not have the right to free speech. In 2024, we should be well aware that tolerating intolerance does not work. Just fucking look around and take a look at what these people are doing with their free speech. I am not the gatekeeper of good morals or the bastion of good values. Some ideologies are objectively bad, though.
Translation: "We support Nazis and would like to offer them passive protection. If you have a problem with them, we will ban you"
Any writers still on Substack need to immediately look at alternative options and shift their audiences to other platforms. To stick around on the site when the founder straight up condones neo-Nazis, and not only gives them a platform but profit-shares with them and their nazi subscribers, is insane.
If they plan to do business in the EU this is illegal.
Well, you can create an account from the EU, although mine got locked after creating just one blog post. And support does not seem to respond, so I moved to a different platform.
Reading about this at work the other day, I announced to my coworkers that Substack is officially bad. Profiting off of nazi propaganda is bad. Fuck Substack.
I had recently subscribed to the RSS feed for The Friendly Atheist and was considering monetary support. They accept via Substack or Patreon. I would have opted for Patreon anyway, because that's where I already have subscriptions. But after learning about this, I'll never support anything, no matter what, via Substack. Eat my ass, shitheads.
What do you mean, banning doesn't work? The less reach those Nazis have, the fewer people can see their Nazi posts and get turned into Nazis. Also, it needs to be clear that being a Nazi is not acceptable, so they don't have the courage to spread their hate. This bullshit needs to stop.
Nope, never supporting anything from Substack again. "Freeze peach" libertarians can go to hell.
if you say nazi and white supremacist content is just a "different point of view", you support nazi and white supremacist content. period.
and it's not surprising, given Lulu Meservey's post on Twitter during the whole situation with Elon basically abandoning moderation:
"Substack is hiring! If you're a Twitter employee who's considering resigning because you're worried about Elon Musk pushing for less regulated speech… please do not come work here."
https://www.inverse.com/input/culture/substack-hiring-elon-musk-tweet
The problem is that some people are quick to call things Nazi and white supremacist, when it's actually just something they disagree with.
That's not the problem at all. If you support fascists, then you support Nazis and white supremacy.
There are a lot of empirical claims surrounding this topic, and I'm unaware who really has good evidence for them. The Substack guy, e.g., is claiming that banning or demonetising would not "solve the problem": how do we really know? At the very least, you'd think that demonetising helps to some extent, because if it's not profitable to spread certain racist ideas, that's simply less of an incentive. On the other hand, plenty of people on this thread are suggesting it does help address the problem, pointing to Reddit and other cases. But I don't think anyone really has a grip on the empirical relationship between banning/demonetising, shifting ideologues to darker corners of the internet, and what impact their ideas ultimately have. And you'd think the relationship wouldn't be straightforward either: there might be some general patterns, but it could vary according to many contingent and contextual factors.
I agree it's murky. Though I'd like to note that when you shift hateful ideologues to dark corners of the internet, that also means making space in the main forums for people who would otherwise be forced out by the aforementioned ideologues - women, trans folks, BIPOC folks, anyone who would like to discuss xyz topic but not at the cost of the distress that results from sharing a space with hateful actors.
When the worst of the internet is given free rein to run rampant, it has a tendency to take over the space entirely with hate speech, because everything and everyone else leaves instead of putting up with abuse, and those who do stay get stuck having the same rock-bottom conversations (e.g. those in which the targets of the hate are asked to justify their existence or presence or right to have opinions, repeatedly) over and over with people who aren't really interested in intellectual discussions, or solving actual problems, or making art that isn't about hatred.
But yeah, as with anything involving large groups of people, these things get complicated and can be unpredictable.
Thank you! Even on Lemmy I find the atmosphere often oblivious or ignorant to marginalized views. The majority here are cis men (according to the poll earlier this year), and it certainly shows. And the people here are probably mostly left-leaning? So I definitely couldn't imagine sharing a space with anyone more right-leaning than that.
It depends a lot on the instance IMO. I didn't like the attitude at lemmy.ml but I like it here at beehaw. I'm very left-leaning, progressive and LGBTQ+ friendly.
Lemmy.ml and lemmy.world are more right-leaning as far as I can see.
Yes sure, beehaw is more progressive. But still I sometimes don't feel so comfortable in its community because it can at times feel very male-centered.
"you have to nip it in the bud immediately. These guys come in and it's always a nice, polite one. And you serve them because you don't want to cause a scene. And then they become a regular and after a while they bring a friend. And that dude is cool too.
And then THEY bring friends and the friends bring friends and they stop being cool and then you realize, oh shit, this is a Nazi bar now. And it's too late because they're entrenched and if you try to kick them out, they cause a PROBLEM. So you have to shut them down."
What evidence did you find to support Substack's claims? They didn't share any.
You can quickly and easily find good evidence for things like Reddit quarantining and the banning of folks like Alex Jones and Milo Yiannopoulos.
Which claims are empirical again?
Well it depends what you define as "the problem".
If you define it as Nazis existing per se, banning them does not "solve the problem" of nazis existing. They will just go elsewhere. A whole world war was not enough to get rid of them.
However, allowing them on mainstream platforms does make their views more visible to mainstream users, and some might fall for their propaganda, similar to the way people get attracted to the QAnon nonsense. So if you define the problem as "Nazis gaining attention" then yeah, sure. It certainly does "solve the problem" to some degree. And I think this is the main problem these days (even in the Netherlands, which is a fairly down-to-earth country, the fascists gained 24% of the votes in the last election!)
But no matter how you define "the problem", making money off Nazi propaganda is just very, very bad form. And it will lead to many mainstream users bugging out, and rightly so.
We also know that going after Nazis and white supremacists works, since all through the 90s they were relegated to the fringe of the fringe corners of the internet.
I always hate policy talk trying to split the hairs of Nazism and "calls for violence".
Even worse, I just can't understand allowing monetization. If you truly "hate the views", stop lining your pocket with their money…
The only thing they hate is not taking their money.
There are too many of these goddamned social networks anyway. After Twitter/X exploded, everyone else wanted to grab a piece of that pie, and now we've got a dozen social networks nobody uses.
If you want a progressive social network that doesn't take shit from goosesteppers, Cohost is probably the place to go. It's so neurodivergent and trans-friendly that I can't imagine them blithely accepting Nazi content. It's just not how Cohost works. "Blah blah blah, free speech!" Not here, chumps. We've got standards. Go somewhere else to push that poison.
Substack started so well... It was looking like the new Medium (after medium totally enshittified). But the discovery was never very good there, and now this. Nope. Not going to blog there.
I wonder if Snowden still supports them.
probably since snowden thinks putin is amazing
Does he really? I think it's more like he got stuck there on his way to Ecuador and now he has no alternative but to "like" Putin :P
After all his plan was never to stay in Russia.
i mean "Edward Snowden gets Russian passport after swearing oath of allegiance. Whistleblower is 'happy and thankful to the Russian Federation' for his citizenship, lawyer says"
I know.. But the point is, he's stuck there.
He had 2 choices: stay in Russia, or come back and face prison in the US.
It's not really like "free will" applies here :)
Whatever the lawyer said is just the minimum required decorum IMO. Just politics. The oath is probably simply required to get the passport.
Putin got to get one-over on the US and Snowden got to stay out of prison (well, in reality a really huge prison but still...). It's a marriage of convenience.
Does nobody remember the exact same dust-up the day after Substack launched?
So they are complicit in it, then. If you are complicit with Nazis, to me, you're a Nazi. I don't give a shit. What's the saying that the Germans have? Like there are six guys at a table in a bar and one of them is a Nazi, therefore there are six Nazis at the table? Yeah, that.
As always, there are several different aspects to consider.
Sometimes the difference between "promotion", "laughing at", and "analysis" depends more on the reader's approach than on the writer's intent.
Then again, sometimes a reader decides they don't want to deal with any of it, which is also respectable.
Look. If there are 9 people at a table sitting with 1 Nazi and hanging out, there are 10 Nazis.
That's the quick, easy, and wrong approach.
What are 9 non-Nazi people supposed to do: kick the 1 Nazi to a Nazi-only table? Leave the table and now have 2 Nazi-only tables? Get everyone thrown out?
Nazism works like any other sect; what converted people need is exposure to other ways of thinking, ideally some human connections with people whom the sect demonizes and tries to keep members away from. Pushing sect members away from society is precisely what the sect wants!
I'm not saying that you personally, or even me, should be the ones to do that, or that we should idly watch, or not have a Nazi-free table.
What I'm saying is that non-Nazis putting up with a Nazi in order to de-program them, should be praised, and that you can't tell what's really going on just by watching who sits where.
This is too complex and nuanced. You punch the nazi and throw them out of the establishment. That's it. That's all.
That's how you reinforce the indoctrination of "us vs. them", and how they're the oppressed ones. Far from "it" or "all".
Yes. I don't care if they feel oppressed. Tolerance does not extend to Nazis. They SHOULD feel oppressed. I will not entertain being soft on Nazis. I will not entertain "seeing both sides". I will not entertain Nazis.
I don't think you understand. Instilling a feeling of oppression, then a desire to "fight against the oppression", is how Nazis get created. Nobody's asking you to tolerate it, or see any "both sides", there are no sides. The only side is whether you work with or against their indoctrination.
All I'm saying is you better double check those 9 other people at that table, in case they're doing the hard work you or me don't want to.
I think I may not have made my position clear. If there is 1 Nazi at the table, and 9 people are just sitting with them hanging out... you have 10 Nazis. They are not "doing the hard work", otherwise the 1 Nazi would not be sitting there. I'm hearing a lot of sympathizing towards Nazis in this thread.
You have made your position very clear, but if what you got from what I'm saying is "sympathizing towards Nazis", then you've missed my point completely, and I don't know how to make it any clearer. 🤷
I just want to state very emphatically that deradicalizing people is a specific skill set and set of actions that is completely different than "being friendly to nazis". And tolerating bigotry so that people don't feel bad about their bigotry is just tolerating bigotry. On that note, on another post you argued heavily with multiple users that white privilege is not real and that you were being oppressed for your whiteness. I thought maybe you were very young, or confused, and tried to have empathy and explain some concepts, but here you are now also arguing that we need to be nice to nazis for the good of society so that they don't feel oppressed. I suppose you might say that pointing this out and making you feel more oppressed would drive you further away, but a better approach, i think, would be to tell you very very directly that the things you have been saying here, in multiple places, are white supremacist talking points. And no one here is going to condone that. Stop. If you need help stopping, that is your responsibility, not the hypothetical 9 other people's.
How can you tell "deradicalizing" from "being friendly" by just seeing people sit at the same table? One of the strategies for deradicalization, is precisely having positive experiences with people the group/sect has vilified.
No. In a nazi-free area, just kick the nazi, that's easy. What I'm arguing is that you should give the benefit of the doubt to the other 9 people. Don't assume.
Let me clear that up:
These are not either-or.
Not very young, maybe confused, maybe living in a different society, most likely with different life experiences.
Thank you for trying to explain some concepts. I learned some stuff in that other post, and I'm grateful for that (even if the conclusion was depressing).
Is this another "learn the book of forbidden words" situation?
I refused to read into white supremacist propaganda any further than seeing their basic manipulation strategies, and that was a few decades ago. Are you asking me to read the updated version?
Over the past several days I've seen you draw out many good faith disagreements about racism or nazism into what seem like intentionally blurry "just asking questions" type derailments, whereby you try to shift the topic of the discussion to other, emotional or tangential details and/or try to misrepresent the issue at hand to make the racism or nazism seem not that bad. I really don't think someone would do that if they were coming from a place of genuine confusion or curiosity or dialogue. I might be wrong, but taken together it really gives the impression, intent aside, that you're trying to spin up plausible arguments for far right stuff and then sow confusion whenever people say "hey, don't do that, it's harmful". I just don't believe there's wiggle room here. I don't want to have a circular conversation about it, but i do want to point out directly what you're doing, because I think it sucks, and I think that you should stop.
Ok, let's be direct: I'm against nazism, racism, sexism, pretty much the concept of -isms itself, and a techno-anarcho-communist at heart. I generally try to avoid putting too much of my own bias into things, and I do have a tendency to focus on exploring a single aspect of an argument (you could call it "tangential details")... but if you see me use anything resembling "far right arguments", then it means I've gone too far and I will be grateful if you, or anyone else, call me out on it.
Is that acceptable?
I think I've only pulled one emotional tangent, just because it's impacting me personally right now. It's hard to be objective about that. But I found the following discussion educative, so... I'm serious: thanks everyone for answering.
PS: merry holidays!
This does not appear to me at all what is happening, at least in this thread, and I would even go as far as to call it gaslighting.
The other user literally said if 10 people are at a table and 1 is a Nazi, then all 10 are Nazis. They have also labelled any opposing view as "sympathizing towards Nazis" in another comment. That is pretty damn fucking far from good faith. And yet, somehow, because this other user pointed out the problem with this type of thinking, you are now accusing them of not being good faith? Are you serious? People are refusing to have any kind of nuanced view of the situation, accusing everyone in that situation of being a Nazi and people who disagree of being sympathizers, but somehow the other person is the one not acting in good faith, or using emotional arguments?
I really don't want to be rude, but your comment reads like textbook projection. They also never said anything to defend Nazis or the far right, not once (*), so that makes you the one who is misrepresenting what they are saying and doing. I encourage you to keep everything you said in mind, but re-read the thread through a more objective lens.
I really didn't want to get involved in this conversation, but some of these comments really frustrated me, and yours was just the straw that broke the camel's back; I had to let some of the frustration out. If you just want to ignore me, that's fine.
(*) At least as far as this discussion is concerned; I do not have an all seeing eye.
The other user is right. Any tolerance for those ideologies gives them a foothold. It makes room for them. There can be no room, at all, for that shit. Ever. Arguing otherwise is dangerous. There is no nuance here. At all.
All areas should be nazi-free areas. If any of them are truly attempting to de-radicalize someone they would know that there is a time and place for it, and out in general society is not it.
This is very reductive, dismissive, and exactly the kind of thing a white supremacist would say to try and justify saying something shitty. Which is exactly PotentiallyAnApricot's point. I could chalk it up to naivety, but just as an outside observer on this thread and others, I don't think it is.
Dismantling systemic oppression personally, interpersonally, and in greater society is a constant process. Especially as a white person. Saying you did some research decades ago and are all good now is not how it works.
Now, this is reductive and dismissive. "Tow it outside the environment" is not an option, you can only create reservation, concentration, and general areas.
Part of de-radicalization is reinsertion into general society. Can't be done outside. Studies have shown that de-radicalization actually lags way behind reinsertion. What you're proposing are lifelong reeducation camps.
The what? Not sure if you realize, but stuff like this "Especially as a [whatever]" is what pushes people over the line.
This is not what I said.
It is very much an option, and one that works. You can't have an inclusive society if you accept members that want other members dead.
https://en.m.wikipedia.org/wiki/Paradox_of_tolerance
After they are no longer a nazi. If they are still a nazi, they belong outside society.
Ok, to be less specific: especially as the people who benefit from systemic oppression, we have to keep more on top of it than those who do not benefit from it. I used white people because we are currently the largest beneficiaries of systemic oppression. If that "pushes you over the line", you were too close to the line to begin with.
The meaning most people get from what you said is that you learned how white supremacists manipulate people decades ago, then disengaged. Again, in order to keep getting better, you have to keep learning, and for fucking sure chuds will keep trying to distract and minimize.
i never used substack before and they are doing a good job making sure i never do. i hope they like being the nazi bar.
🤖 I'm a bot that provides automatic summaries for articles: ::: spoiler Click here to see the summary While McKenzie offers no evidence to back these ideas, this tracks with the company's previous stance on taking a hands-off approach to moderation.
In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions.
"We're not going to get into specific 'would you or won't you' content moderation questions" over the issue of overt racism being published on the platform, Best said.
In a 2020 letter from Substack leaders, including Best and McKenzie, the company wrote, "We just disagree with those who would seek to tightly constrain the bounds of acceptable discourse."
The Atlantic also pointed out an episode of McKenzie's podcast with a guest, Richard Hanania, who has published racist views under a pseudonym.
McKenzie does, however, cite another Substack author who describes its approach to extremism as one that is "working the best." What it's being compared to, or by what measure, is left up to the reader's interpretation.
Saved 57% of original text. :::