Why is youtube recommending conservative "talking points" to me?

V01t45@lemmy.fmhy.ml to Asklemmy@lemmy.ml – 481 points –

Hi, I am a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things: guys dunking on women, Ben Shapiro reacting to some bullshit, Joe Rogan, the works. I almost never allow it to go to that type of video, and when I do it is either by accident or out of curiosity. My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, and blacksmithing, with occasional songs and videos about funny bullshit. I am not from America, and I consider myself pretty liberal if I had to put it into terms used in America. But European liberal, so by American standards a socialist. Why does it recommend this shit to me? Is this some kind of vector for radicalization of guys in my category? Do you have a similar experience?


You most probably viewed these types of videos a few times and the algorithm started recommending them to you. It only takes a couple of videos for the algorithm to start recommending videos on the same topic. You can easily solve this by clicking the 3 dots next to the video and selecting "Not interested"; do it enough times and they'll be gone from your feed.

This. The algorithm doesn't care whether you like it or not; it cares whether you engage with it or not. Even dislikes are engagement.

I'm not talking about liking the video, I'm talking about viewing it. Viewing the video, especially a large part of it, contributes to watch time, which affects the algorithm's recommendations.

It's also kinda dodgy what YouTube considers "viewing" a video. 3 seconds of autoplay? You viewed it. Misclick an ad trying to skip? Viewed it. AndroidTV previewed the video when you left it highlighted for too long? You guessed it, viewed.

Fortunately figuring out what it considers you to have viewed is pretty straightforward: if it's in your watch history it considers it viewed. You can also go through that and manually purge things.

Of course there's still the question of why YouTube will so consistently recommend videos that get people started down this downhill slide of absolute dogshit atrocious content.

If you don't disable the feature where it plays the videos directly in the thumbnail by hovering over it, it also counts as viewed if you let the thumbnail play for like 3 seconds. I hate it.

I meant "like" as in "enjoy," not as in clicking the thumbs-up button; sorry for the confusion. Even a video that pisses you off, one you thumb down along with every comment, is considered engagement, and it not only feeds you more but also boosts the video as a whole.

All it cares about is whether it keeps you on YouTube. It's like the news: outrage is profitable even if everyone hates it.

I am constantly bombarded with Jordan Peterson videos despite disliking them and telling the algorithm to show less like this.

I'm not sure how it profiles people, but it sucks.

Disliking is engagement. Instead tap the three dots and select "not interested"

Go into your watch history and remove them after disliking them. I find removing stuff from my watch history has a bigger impact than disliking it. You can also have YouTube stop recommending specific channels to you. Odds are, if they're posting Jordan Peterson content, you're not missing anything of value by blocking them.

The thing is I'm not sure what I've watched that's triggering the JP spam.

Maybe I need to just nuke my entire history and start again.

If you leave YouTube/close the app every time Jordan Peterson opens his big mouth, the algorithm will get the idea quickly enough.

I don't know how much a Not Interested designation does. I've done that for channels where I don't want to see THAT video, but still like the channel in general.

What I do is just block the channel: if this is what the channel wants to espouse, then I'll do better by eliminating it entirely.

Yeah, I get those too. Am always a little surprised, but I'll be honest I'm interested in knowing what the other bubble is telling themselves... If it gets to be too much I'll click not interested, but right now it's just a bizarre break in my feed.

Edit: Just remembered this quote that fits "I'm fascinated by trash TV. The poet must not avert his eyes" - Werner Herzog

You don't even have to watch those types of videos. I got into watching Star Trek lore videos and ended up on one channel in particular whose videos always devolved into rants against SJWs and leftists. This from someone who posts lore videos about a show depicting luxury gay space communism. But apparently, simply because a large portion of his viewership also engages with the bigotry and hatred, I started getting tons of recommendations for shit that I have never watched and would never watch.

But yes, marking "Not interested" and deleting from history was the best way to get it out.

My Shorts feed is the worst; regular recommendations seem fine. Especially the podcasts dunking on women. "So who should pay on the first date?"...


If I accidentally watch a Linus Tech Tips video, that will be all it recommends to me for the next month.

I watched a Some More News video criticizing Jordan Peterson, and Google thought "did I hear Jordan Peterson? Well in that case, here's 5 of his videos!"

Almost all content algorithms are hot garbage that are not interested in serving you what you want, just what makes money. It always ends up serving right-wing nut jobs because conspiracy theorists watch a lot of scam videos.

Edit: my little jab at Linus has nothing to do with politics. I have no idea what his views are. I only mentioned it to point out how YouTube will annihilate my recommendations if I watch a single one of his videos.

I watch Linus from time to time, but don't get that sort of recommendation (unless I watch some gun videos!). I only watch his tech stuff and don't know anything about his politics. Now I'm worried.

Oh I didn't want to imply that Linus puts out political opinions. Of the few videos I've seen of his it's all tech hype videos. I was only giving an example of the algorithm deciding to nuke my recommendations if I watch one of his videos.

I watch Linus all the time and almost never get these recommendations. Might be a certain combination of interests?

I can confirm, I get pushed Linus hard. I watched 3 or 4 of his PC build videos one time a while back and never clicked on a WAN Show episode (his podcast). But now if I let just about any gaming/tech video roll to the next one, I get served entire podcast episodes of his like 25% of the time. I never asked for this, I always click off, and they keep coming back. I'm not mad really, just bewildered. Idk, the retention must be there.

I used to watch LTT a ton. He doesn't disclose his political opinions, but from watching years of his live streams, it's pretty clear he fits the PNW Canadian + American metro demographic quite well. Basically Seattle/Vancouver, you probably get the gist of it.

To be fair, you did watch a 3 hour JP video...

That's my point, the algorithm doesn't understand context.

It sees that you like standup comedy (POTENTIALLY ANTI-WOKE) and gaming (POTENTIALLY INCEL) and are in your 30s (LIKELY HAS SOME DISPOSABLE INCOME).

So it's pipelining you to the annals of YouTube that check those boxes, have very good viewer retention and have good user engagement.

The ones with good user engagement are the ones that raise your blood pressure. They make you angry. So, politics, or dirty cops, interrogations, murder stories, stuff like that.

You can resist them all you want, but stuff that makes people angry and chatty and commenty and re-watchy is like cocaine to the algorithm. It makes them money, so they spam it everywhere it even remotely makes sense to try to get you stuck in their quicksand.
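To make that concrete, here is a toy sketch in TypeScript of what "engagement-first" scoring looks like. The signal names, weights, and example numbers are all invented for illustration and have nothing to do with YouTube's real ranker; the only point is that enjoyment never enters the score, just predicted retention and interaction.

```typescript
// Toy illustration only: invented signals and weights, not YouTube's system.
// It demonstrates the point above: "enjoyment" is never part of the score,
// only predicted watch time and interaction ("dislikes are engagement" too).

interface Candidate {
  title: string;
  predictedWatchSeconds: number;    // how long the model expects you to stay
  predictedInteractionRate: number; // likes + dislikes + comments per view
  predictedEnjoyment: number;       // 0..1, tracked here only to show it is ignored
}

// Pure "time on platform" plus a bonus for any interaction at all.
function engagementScore(c: Candidate): number {
  return c.predictedWatchSeconds + 300 * c.predictedInteractionRate;
}

function rank(candidates: Candidate[]): Candidate[] {
  return [...candidates].sort((a, b) => engagementScore(b) - engagementScore(a));
}

const feed = rank([
  { title: "Forging a hook (blacksmithing)", predictedWatchSeconds: 240, predictedInteractionRate: 0.01, predictedEnjoyment: 0.9 },
  { title: "Pundit DESTROYS opponent", predictedWatchSeconds: 420, predictedInteractionRate: 0.08, predictedEnjoyment: 0.2 },
]);

console.log(feed.map((c) => c.title));
// ["Pundit DESTROYS opponent", "Forging a hook (blacksmithing)"]
// The rage bait wins despite the far lower predictedEnjoyment.
```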

This is the real answer. It's easy to forget that for most people who are famous for their unusual political views, most of their overall content has nothing to do with that. There's something about politics that can turn even the most open-minded of individuals into raging ideologues.

It's hilarious to me that Joe Rogan is now known for his like 3 conservative views when I mostly remember him as the guy who hosted Fear Factor, did every drug known to man, interviewed scientists in every field out there, and did that really popular interview with Bernie Sanders a couple years ago.

The point is that every hobby and niche interest has someone who gets way too hung up on one particular issue and devotes way too much time to talking about it, dragging the whole community down with them.

Because 'conservative' content gets a lot of engagement (i.e. ad money). The more they recommend it, the bigger the audience and the bigger the ad payout. They're literally monetizing hate.

Maybe you have the same problem I have: my wife is still a republican. When that kind of stuff shows up, I know she has been watching it on the family PC. She's not that tech savvy, so I usually go in later and block or limit some of it. It's a pain to fight the algorithms.

As a woman, I can never understand how other women can be Republican.

My mom was a Dem growing up, but then she fell down the religious rabbit hole after I left home and it was all downhill from there.

My little sister though? I have zero idea how she ended up Republican. It's fucking bizarre. 🤷‍♀️

"The Leopards won't eat my face"

Its the "Leopards eating people's faces party"; not the " Leopards eating my face party". GOSH!

PS - is that a community here yet?

I agree with and get the point of the metaphor. But the leopard metaphor always seemed a bit tortured to me. There's got to be a cleaner way to communicate that point.

You'd think so; and yet, so many people that metaphor is directed at don't even realize...

In our case, we both grew up in a conservative state with conservative parents. But over time I drifted away from it (and it drifted into crazy land). However, she still hangs on. She's come around on a few things, but she often falls into the false-equivalence mindset that both sides are just as bad. And whenever she starts to bring up any anti-vax or Joe Rogan BS, I try to stomp on that real quick. But I know she will always vote for anyone with an R by their name when it comes down to it.

My thoughts: it never starts as full-blown conservatism but reaches that point through a gateway topic. My brother was always financially conservative, bootstraps and all that, but over the last 12-18 months, as the media he consumes moves further right, he has become more socially conservative, because these people were in line with his thinking before, so of course they must be right now.

I caught him going on about trans people corrupting his kids a couple of weeks ago. My brother might be a dick but he wouldn't have been saying that 5-10 years ago.


Could probably use another account with a different profile and just switch whenever you use the PC.

This is why I insist that each person have their own account on a shared PC.


I get them all the time. Pro-gun crap, Jordan Peterson, Joe Rogan, and other trash. I constantly hit "don't recommend this channel" or "show me less of this," but they still come. I honestly think YouTube is getting paid off.

Anger drives engagement because you're more likely to comment when you're angry. If you're fine, you probably just move on. So the algorithm prioritises stuff that makes people angry.

Conservative propaganda is a highly coordinated and well-funded network. They pay for preference.

It's not that so much as the Howard Stern theory. The people that like that stuff click on it and move on.

The people that hate that stuff hate it enough to interact with it... That means more time on the site, more ads, more revenue.

It's just the hate algorithms... They're lucrative.

Machining and blacksmithing are highly correlated with right-wing BS in the US. Check my uname and ask me how I know.

This one may not be so obvious, but I've seen someone lose their marbles over the crazy idea that they get recommended stuff for children when all they watch on YouTube is Minecraft videos and sometimes Roblox. And people in the comments agreed...

The YouTube subreddit wasn't full of bright people.

For me, if I ever look up Warhammer 40k, it immediately starts sending me losers like The Quartering or other channels like his.

Like, no, YouTube, I don't want to hear about how feminism and "wokes" are ruining Warhammer. I also don't want to be sent Sargon of Carl videos. Ugh. Lmao

My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, blacksmithing,

Because a lot of the type of guys who like seeing those stupid conservative videos also like many of the same things as you. Gaming™ is well known to have a problem with the alt-right, react videos have a very similar structure to conservative "libs destroyed with facts and logic" types of videos, and finally a lot of conservatives like to think of themselves as an old-fashioned man's man, so they enjoy things like metalwork and other typical "manly" careers.

Pretty sure this is it - I see them as well but my interests are science education, gaming, and 3D printing.

Not to be that guy, but you can say "don't recommend this channel again" to YouTube. I haven't seen The Quartering, Asmongold, etc. for years now. Unless you search for them.

I had the issue of getting Andrew Tate shit recommended. I said don't recommend that, and blocked the uploader. YouTube still suggested me that video. Exactly that video.


I've only been recommended weird stuff like this maybe twice a year, but maybe that's because I've used YouTube since it was new, and it learned my tastes before all this dodgy stuff showed up at all. Maybe I got ahead of it?

Now I mostly just get recommendations for niche electronic projects, dumb skits, and callmekevin. Just how I like it 😎


Your interests have a strong correlation with people on the right aside from maybe react videos.

But even if your interests were not so strongly correlated with the right, you would probably still get right-wing ads or videos suggested. They garner the highest engagement because it is often outrage porn. Google gets their money that way. My subscriptions are to left-wing political channels, science, and solar channels, but I still get a decent amount of PragerU and Matt Walsh ads. Reporting them does not stop them from popping up either.

Yeah big tech loves to throw dumb stuff your way to piss you off and keep you engaged, even if you've never shown an interest before.

Algorithms. It's why I use NewPipe on mobile and FreeTube on desktop. Both allow me to import and export my subscription list, block ads, and have SponsorBlock support.

But most importantly, they don't use Google's algorithm to recommend videos that might be of a radical nature based on a misinterpretation of past viewing history.

It's because you are a guy in your 30s. You (and I) are in a shitty demographic, so the algorithm peddles us shit. It doesn't matter how many times I mark shitheel vids by the people you mentioned as "do not recommend channel"; I will always have Rogan or Peterson popping up in my feed to ruin my day. It drives me crazy.

It can't just be that though. I'm a guy in my 30s and I never get any of these channels recommended to me, just videos by channels I've subscribed to or similar to what I've been watching. Even after watching political videos, I don't get those shysters recommended to me.


One thing I noticed about browsing the YouTube homepage on PC is that if your mouse hovers over a video, it starts playing, and that puts it in your watch history. So you might be accidentally adding a trash video it recommended to your watch history while looking at the other offerings. You can disable the "mouse hover auto play" by clicking your profile pic in the top right > Settings > Playback and performance > Inline playback.

You've probably watched the other side quite often, so YouTube is recommending this for ya to enrage you and keep you engaged. Just good old rage baiting.

The devil's algorithm.

Clear your history! I limit a lot of what Google collects. If I see videos trending in that direction, I clear my YouTube history and start fresh.

I think it might be the opposite. If extreme content is kind of the "default" when the algorithm doesn't know what to give you, then starving it of history might push it into that default more often. I have a very used YouTube account with a metric ton of history, and honestly I very rarely see that kind of content. (From Europe though, so it might be different.)

What I'll do is clear history, then watch something to seed it as a baseline. E.g., one video on lo-fi music, and another on how an 18th-century naval ship was built. Or perhaps they've just captured all my data secretly and know that extremist videos perform poorly with me?

You can have Google clear things automatically so it'll only keep things for a certain amount of months.

Without any history Google often shows pretty extreme videos, so in my experience the lack of any history is not good.

You need to train your algorithm.

When you hover over those videos, there will be three dots in the lower right-hand corner; use either "Not interested" or "Don't recommend this channel" to clear the right-wing trash from your feed.

On videos you do like, you need to do some engagement: watch complete videos, thumbs up (or down, it doesn't matter), and comment.

Do that for a few days and your feed should clear up.

And make sure to turn off the setting that autoplays videos in the browser when you mouse over them (I don't remember the setting's name). Hovering your mouse over a video counts as watching/playing it if the setting is on.


I've been vegetarian since 2009 and in the past 2-3 years I keep getting all kinds of bow hunting videos.

It's particularly amusing when my plan was to watch an old episode of "The Golden Girls."

They REALLY misread my demographic.

I'm 30 and have a small family, too. When I watch Shorts on YouTube I get the exact same content you're describing. None of the long videos I'm watching are political, yet the algo keeps throwing them at me. I get a lot of Jordan Peterson crap or Lil Wayne explaining how there's no racism. I hate it.

The Lil Wayne stuff is so strange. Usually when some specific celebrity pops up in my feed, I assume their publicist is rehabbing their image after some public incident or recently exposed private conflict. YouTube Shorts isn't like that.

YouTube shorts seems to be exclusively just a pipeline to rightwing talking points and the unfunniest parts of stand up comedy framed to serve the same rightwing pipeline.

A while back the Conservative Party of Canada was caught inserting MGTOW / Ben Shapiro tags on all their YouTube uploads. In other words, they were poisoning people's social graph in order to cause exactly what you're talking about.

This is why I use a Google account that is only for YouTube entertainment. I keep it in a separate Chromium profile. I turn on all the privacy toggles in the Google account. Only YouTube history is turned on. I curate the watch history.

You cannot tell what content might have breadcrumbs that eventually open the floodgates of far right echo chambers. They do this intentionally. So it requires active measures on your part to counter them. You've got to manage your account with intention. I do not use that account at all for random browsing. I usually do that in incognito on a different browser.

Conservatives and fascists are the same group, so I'll refer to them as fascists.

You are talking about one of the core criticisms of corporate secret algorithms that determine what to influence you with. Fascism is forced to creep into everyone's worldview when you use standard social media, and the average person wouldn't have the slightest idea. Certain key things will be more related to fascist content, like philosophy, psychology, guns, comedy. If you think about what fascists enjoy, or what they need to slander, then what I said makes more sense.

Jordan Peterson does a lot of vids around psych/philosophy to redirect curious people to false answers that are close to true but more agreeable to fascists. An example of a psychological co-option is "mass psychosis" being co-opted into "mass formation psychosis" by fascists. Mass psychosis explains too many true things, whereas mass formation psychosis redirects people in a more palatable direction for them.

This is why I want to be nowhere near corporate media if possible. If you delete your cookies (or private browse for the same effect), then YouTube will promote the things most adjacent to what you watch, like old YouTube used to do, although it'll still promote fascism when it's directly adjacent. With cookies, though, they have an excuse to let questionable content linger statistically too often.

You actually believe that every single conservative is a fascist? Jesus Christ political literacy is dead

Maybe you need to go have a read about fascism, then tell me, are the points presented becoming more or less similar to current conservative views?

Oh, I don't deny that a lot of Republicans are becoming fascists, but that is far from "all conservatives are fascists".

Nooo, I finished my comment to you and then upvoted you while proofreading >_> turns out that kills the comment.. ugh..

If I could rewrite my first comment to just talk about conservatives, then I would have. I'd rather focus on the active threat of evil corp shenanigans.

I agree that Republicans are becoming fascists in the overt sense. My view is more that they are unknowingly fascist by their actions, not by awareness. Conservative policy in a nutshell is really just "we good, you bad" and power plays without any rational explanation. Can you really name any decent policies conservatives have? All I hear is hate, scapegoating, intensification of totalitarian power structures, increasing control for corporations, war and guns. Those things were key in Nazi Germany too.

I just use the term capitalism since there is no difference in the goals of either.

Not necessarily in all countries, but in the USA I'd say all Republicans are fascist, or at least complicit in the spread of fascism. The conservative wing of the Dems, nah.

Because the algorithms favor alt right garbage heaps and the companies will never bother to fix them. Hence why we need some regulations.

Let's be honest. The companies only really care about "engagement" and the Right is more than happy to provide that.

I'm not sure exactly what you mean about regulations, but I think regulating what political views can and can't be seen on the internet is a very slippery slope. I completely agree that their viewpoint is garbage, however, countries like China and Russia made laws with the same justification, and now they completely control the media in their countries and control the political views of their citizens because of it. That said, as much as I distrust my government, I don't believe they are as bad as Russia or China. However, a day may come where they are just as bad. And if that day comes, I don't want them to be able to control what political views we are allowed to put on the internet.

I very rarely use YouTube, but I've noticed that when I start listening to older top 40 or contemporary bluegrass, those videos drop in

Youtube really said "you're old, you must be racist" lmao

"Hi, I'm a guy in early thirties--"

Well there's your answer.

Those types of videos have the most engagement. YouTube is trying to show you whatever it thinks will keep you there longer.

Turns out conservative radicalization keeps people there longer. I've never googled or watched any Andrew Tate videos, but my recommendations have at least 3 of his videos front and center.

If you watch any kind of gaming videos, and haven't trained your algorithm, then you'll get flooded with this shit

I bought a new phone a few months back, and just for shits and giggles I tried looking at YouTube Shorts without any account logged in. I clicked on a food Shorts, and within 5 Shorts, I got a "manosphere" video, and within 10 Shorts I got Ben Shapiro.

Unfortunately, fear and rage drive engagement, and these Conservative grifters are more than happy to give it to them and more. That on top of corporations being Capitalists and thus right-wing by default.

you are a closet conservative. the algorithm has spoken.

A few years ago I downloaded a browser extension that stops showing me recommended videos, both on the homepage and on the side of videos. I can only watch what I'm subscribed to and what I search for. You'd think it's a big sacrifice because you can't discover as many videos, but in reality I've gained so much of my time back, and control over what I actually want to watch.
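For anyone curious, an extension like that mostly boils down to hiding YouTube's recommendation surfaces while leaving search and subscriptions alone. Here's a rough content-script-style sketch in TypeScript; the element selectors are guesses at YouTube's current markup and go stale as the site changes, which is why a maintained extension is the practical choice.

```typescript
// Rough sketch of a "hide recommendations" content script.
// The selectors below are assumptions about YouTube's markup and may be
// out of date; a real extension ships an updated list.
const RECOMMENDATION_SELECTORS = [
  "ytd-rich-grid-renderer",                    // homepage grid (assumed)
  "ytd-watch-next-secondary-results-renderer", // "Up next" sidebar (assumed)
  "ytd-reel-shelf-renderer",                   // Shorts shelf (assumed)
];

function hideRecommendations(): void {
  for (const selector of RECOMMENDATION_SELECTORS) {
    document.querySelectorAll<HTMLElement>(selector).forEach((el) => {
      el.style.display = "none";
    });
  }
}

// YouTube is a single-page app, so re-apply whenever the page re-renders.
new MutationObserver(hideRecommendations).observe(document.documentElement, {
  childList: true,
  subtree: true,
});

hideRecommendations();
```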

metallurgy, machining, blacksmithing

That could be it. I don't know YouTube's algorithm, but typically they work by finding what other users watch the videos you watch and recommending you other videos those people also watched. I wouldn't be surprised if the guys watching blacksmithing videos also tend to watch Joe Rogan and the like.
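As a minimal sketch of that "people who watched X also watched Y" idea, here's a generic item-to-item co-occurrence count in TypeScript. The video IDs and histories are made up, and real recommenders are far more elaborate; this is just the shape of the technique described above.

```typescript
// Count how often other videos appear in watch histories that contain the
// seed video, then recommend the most frequent co-watches.
type WatchHistory = string[]; // video IDs one user has watched

function coWatchCounts(histories: WatchHistory[], seed: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const history of histories) {
    if (!history.includes(seed)) continue;
    for (const video of history) {
      if (video === seed) continue;
      counts.set(video, (counts.get(video) ?? 0) + 1);
    }
  }
  return counts;
}

function recommend(histories: WatchHistory[], seed: string, k = 3): string[] {
  return [...coWatchCounts(histories, seed).entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, k)
    .map(([video]) => video);
}

// If most people who watch "blacksmithing" also watch "rogan", the hobby
// video drags "rogan" into your recommendations even if you never asked.
const histories: WatchHistory[] = [
  ["blacksmithing", "machining", "rogan"],
  ["blacksmithing", "rogan", "shapiro"],
  ["blacksmithing", "machining", "standup"],
];
console.log(recommend(histories, "blacksmithing"));
// ["machining", "rogan", "shapiro"]
```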

I'd argue that gaming and memes are much bigger contributors.

I almost never allow it

The times you do allow it are all the algorithm cares about, sadly. Any kind of engagement is great for companies.

"Hate Rogan? Cool, watch some Rogan as hateporn, hate watching is still watching."

Hi, I am a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things: guys dunking on women

Yeah, it really wants me to watch Tim Pool. It's like, c'mon guys, I'm too old for that guy's tastes. TikTok will at least take the hint.

You can see what Google (thinks it) knows about you.

  • Go to your Google Account (https://myaccount.google.com)
  • Manage your Google Account
  • Select Privacy and personalisation.
  • Under this Data & privacy page you'll find History Settings, Ad Settings, and more.
  • For example, go to Ad Settings and click on Ad Personalization.
  • Now you'll see How your ads are personalized.

I think you can even remove stuff if you want.

The algorithm is clever enough to know that people that watch a few of those videos are likely to watch a whole lot more. So it's good business to recommend them as often as possible. If they CAN convince you to dive into that, the stats are that you will start to watch a ton more YouTube content.

I'll see major swings in the algorithm from time to time (a week of bass guitar video recommendations for some reason), but I can usually trace it back to a video or two that I watched, after which it just decided to try and cram the topic down my throat. I would just say keep avoiding the conservative vids to try and get the recommendations to stop. I wonder if maybe one of your kids is watching videos on your account and skewing it?

It can also be link-based, where it recommends videos that conservatives have enjoyed that seem unrelated. Maybe you watch home improvement videos and historical war videos, and the algorithm then feeds you new content that seems unrelated but, according to it, actually is related, since it's trying to expose you to new interests to expand the time you spend there.

The best I've found is just not using YouTube with an account and having cookies cleared on exit, or using NewPipe (Android) and FreeTube (desktop), which let you have an accountless subscription feed with adblock and SponsorBlock support.

(a week of bass guitar video recommendations for some reason)

Sometimes I wish youtube's algorithm were sentient so I could shake it and ask why the fuck--

For me it was a week of growing your own backyard basil. Not herb gardens, not vegetables, not self-sufficiency in general. Literally just basil. I don't even own a single plant.

I think such content gets the most engagement. Dunking on leftist ideas brings right-wingers in to celebrate and parrot the piece, while pissed-off left-wingers try to explain why the argument doesn't make sense.

The algorithm wants engagement first and foremost (positive vs negative is irrelevant), after that it wants to push view points that preserve the status quo since change is scary to shareholders. So of course capitalist/fascist propaganda is preferred especially if the host is wrong about basic facts (being wrong drives engagement.)

I get right wing stuff only on YouTube shorts typically. And I think it's because I'll watch them. I find it interesting in a detached sense.

Good to know what you're up against. Same reason I try and watch as many Trump speeches as I reasonably can.

Because controversy makes money and conservatism is filled with controversial opinions and purposely obtuse takes intended to spark conversation and promote divisiveness. That's the grift.

machining

No idea about your algorithm problems but have you seen the Cutting Edge Engineering Australia channel? It's so good.

You can view what Google "knows" about you in your account settings. I made my account when I was very young; I lied about my age and my gender, and then it made assumptions about my professional situation based on my interests. I guess many people in my gender and age group who share my actual interests (tech, movies, culture, food) are also interested in the kind of content you described (Joe Rogan, Jordan Peterson, Yiannopoulos, etc.). I keep clicking "not interested," but the algorithm keeps suggesting these videos to me. I don't mind that Google doesn't know my politics. I'm a feminist, but there's really not a lot of interesting discourse about feminism on YouTube, so I just read and attend real-life lectures instead.

I would wager a guess that it's your regular interests. YouTube sees that people who like machining, blacksmithing, etc. have a good chance of also being conservative. You probably are just part of the odd cases where you like those hobbies but aren't conservative.

Your post raises an interesting point, though: even if YouTube didn't intend for their algorithm to be a pipeline for radicalism, simply by encouraging engagement and viewership, their algorithm ends up becoming a radicalization pipeline anyways.

If they piss you off, you will stay on their platform longer, and they make more money.

That is the sad truth of EVERY social network.

Lemmy might not be that advanced yet, but as soon as it gets big enough to need ads to pay for bandwidth and storage, soon after it will add algorithms that show you stuff that pisses you off.

One way to combat this is to take a break from the site. Usually after a week, when you come back it will be better for a while.

I think it has more to do with the stuff you watch than wanting to piss you off.

All YouTube recommends to me are videos of kpop, dog grooming, Kitten Lady, and some Friesian horse stable that went across my feed once. Oh, and some historical sewing stuff.

If they started recommending stuff that pissed me off, I wouldn't bother going back except for direct videos linked from elsewhere.

Edit: Rereading what OP said they watch, their interests are primary interests of the right wing in the US. If they don't train the algorithm that they don't want that content, the algorithm doesn't know that those interests don't intersect.


Like others have said, the things you watch are prime interests for the right wing in the US. You have to train the algorithm that you don't want it.

I think the algorithm is so distorted by right-wing engagement that it will end up recommending right-wing content even if you actively try to avoid it. I watch YouTube Shorts and I always skip if it's Shapiro, Peterson, Tate or Piers Morgan. I also skip the moment I feel like a Short might be right-wing. Scroll enough and eventually the algorithm will go "How about some Shapiro, Peterson, Tate or Morgan?" Give it enough time and it will always try to feed you right-wing content.

I suppose if you do nothing but scroll YouTube endlessly, it just starts to recommend anything to keep you on the platform.

But I've had an account since they started and even watch The Young Turks and C-Span here and there and almost never get anything political, let alone right wing.

My algorithm is at the point where I get Korean commercials, which is honestly fine with me.

Idk about that tbh. I have an account, my algo is pretty trained, and there are times I have even told YouTube not to suggest Peterson and Shapiro stuff! I mostly watch leftist content! But I still get Ben Shapiro reacting to Barbie and whatnot (I am not even American).

The best way to tune the algorithm on Youtube is to aggressively prune your watch/search history.
Even just one "stereotypical" video can cause your recommendations to go to shit.

This couldn't be any clearer.

Android TV: install SmartTubeNext. Android phone: set Private DNS to NextDNS to block ads, plus install Firefox with uBlock Origin. Or, if you want an app, install NewPipe from the F-Droid app store.

Follow-up question: why won't YouTube Shorts trust me when I say I don't want to be recommended channels?

So… many… dancing… trends. And they keep sending more.

The YouTube algorithm only cares about engagement, not likes or dislikes. It treats likes and dislikes neutrally and only cares whether people are actively leaving impressions; whether it's liking with joy or disliking with anger, engagement is a signal to show more. I've heard that, contrary to logic, it's better to just skip immediately and not press anything signifying a reaction to the content shown, since it'll perceive any response as a reason to show more.

Yeah, I can understand that for likes/dislikes or comments of "this is dumb," but after hundreds of "do not recommend this channel," the algorithm should be able to tell a lack of interest in a particular kind of content.

You don't need to have watched something to be bombarded with similar content; YouTube recommends things that are watched by people who watch the things you watch (sorry about that). And it seems to consider overall popularity, at least for me, so it usually recommends stupid popular right-wing things just because they're popular overall and happen to be watched by a lot of people who also watch Dota 2, for example. I had to stop YouTube from using my history to suggest content; my front page is full of things that I've already watched from my subscriptions, but for me it's better than YouTube's stupid suggestions.

Yeah. I never really watched Shorts, but I recently started watching them and I get tons of right-wing/red-pill shit in there despite clicking "don't recommend this channel" (or whatever it says) on like 50 of those types of channels.

I used to get reasonable content as per my interests. Now, after having a kid, it's just baby songs and Bluey. I use SmartTubeNext on a Fire Stick, so at least I don't get ads, but it's not great.

Could be paid promotion. I get a lot of suggestions in my feed for some really awful music in genres that I never listen to. I wouldn't be surprised if the record label is paying to put it there.

You need to learn how to properly prune your feed. I got some of that stuff briefly, but I kept blocking it and choosing "not interested," and eventually it stopped. My feed shows me nothing I don't want now. It's just a matter of shaping it into what you want.

If I use a private window and don't log in, I get a lot of right-wing stuff. I've noticed it probably depends on IP/location as well. If I'm at work, YouTube seems to recommend things other people at the office listen to.

If I'm logged in, I only get occasional right-wing recommendations interspersed with the left-wing stuff I typically like. About 1/20 videos are right-wing.

YouTube Shorts is different. It's almost all thirst-traps and right-wing, hustle culture stuff for me.

It could also be because a lot of the people who watch the same videos you do tend to also watch right-wing stuff.

In general, the algorithm tries to boost the stuff that maximizes "engagement," which is usually outrage-type stuff.

Why does it recommend this shit to me

Because you, or someone using your account, has watched this type of shit in the past.

Not necessarily. Although YouTube Shorts may be its own thing in terms of algorithm, I frequently see Andrew Tate, Ben Shapiro, and Jordan Peterson clips despite disliking them and immediately scrolling past when I see their faces. I have also encountered a lot of 2014-style anti-feminism content this year, where someone is shown mocking "feminists" making what seems like a stupid remark and getting owned, with some sigma-face meme from the American Psycho guy and music.

It has been the case multiple times that the YouTube algorithm makes weird connections which often lead to right-wing channels being promoted. Or sometimes an entire subsection of creators is linked with the alt-right without being directly connected (the old atheism sphere and gaming channels are common ones).

It could be where you live too, maybe?

I live in the UK and never see ANY right wing stuff, even though we have our fair share of xenophobic nutjobs (some of them are even running our government's immigration department currently).

But I've heard that some places in the USA may be extremely polarised regionally, and so maybe that coupled with your other demographic information such as gender, age, occupation, etc, might be giving YouTube the idea that you probably fit the standard local mould for a right winger and thus might appreciate the same suggestions?

That's been my best guess whenever I see this kinda thing mentioned, though honestly it's just a guess! It always seems to be USA people suffering from it too :-(

Maybe the laws against hate speech, disinformation, etc, aren't as tight there and YouTube doesn't have as much incentive to hide that content? That seems less likely, but who knows! When I lived in the USA I saw soooo much more blatant flagrant lies in official emails, ads, media content and such, it seemed acceptable in a way it's not here.

I really don't think location is playing a role here. I'm from Chile, but 100% of the content I consume on YouTube is in English, and the Shorts have also appeared in English. I doubt this content is due to location, as I'm not close to the US, nor is there particular US cultural dominance in the area (most people consume YouTube in Spanish, be it creators from Latin America or Spain).

I live in the Southeastern US, never get bombarded with right-wing stuff on Youtube. My conclusion is if you're getting hit with that content, at some point you've watched that sort of content. Always click the 3 dots and click Don't Recommend Channel. It actually works, contrary to what people are saying here.


I am guessing you probably viewed enough of these videos that YouTube's dumb algorithm is like, "Oh hey @V01t45@lemmy.fmhy.ml wants to see right wing stuff so let's show him that." I agree that it is very annoying. This is why we need to rally behind starting to use PeerTube and cancelling YouTube.

Social media in general has that habit of further spreading far-right content and dragging people into such content bubbles.

Even if you watch left-wing videos, sometimes YouTube just goes "oh, you're a young-ish male who's into politics? Here are some nazi vids you'll like." ESPECIALLY if the left-wing videos talk about righties.

My YouTube recommendations are usually spot on, but I do get Joe Rogan videos sometimes. I could see people sliding into radicalizing garbage easily from there. Rogan's so big that he gets cool guests, but they're wasted on him as a host.

Fix the input, not the algorithm: either disable watch history, or clear it of any channel whose viewers you wouldn't trust to recommend positive channels.

If you watch something that turns out to worsen your experience, purge it from the view history, undo any likes and remove any comments.

And use the "do not recommend channel" button.

I'm convinced that button just recommends them more. I clicked it for The Quartering some 30 times before I got a browser plugin to just outright block channels.

And youtube still tries to send me his videos.

My "YTā„¢ experience" has gotten a lot better ever since I started avoiding it altogether and opted to watch videos through an alternative frontend. I do get a fairly different "popular" feed, but I mostly ignore that and go directly to my subscriptions feed instead.

You can get rid of a lot of the bullshit YouTube loves to shove down your throat by telling it not to recommend the channel. I haven't got any of that garbage in years

What state do you live in?

I know I'm doing good WHEN I get served this kind of content. I don't browse YT logged in, my browser always deletes cookies on close, and I run Ghostery/ad-block plugins.

Living in Texas, I kinda assume that is why I constantly get this right wing content (ads, vid recommendations). I figure my location is all they know about me and so that is why I get served this content.

It just tells me they don't know anything about me, so my countermeasures are working 🤞🏼

Just use Odysee, the YouTube algorithm is fully cooked from millions of hours of pure crap being posted every day.

Conspiracy theory: all the corps are working together to push global politics to the right, because leftists dgaf about taking money from corps (as in, they'll tax the corps & the rich).

You can try getting another advertisement ID.

Search on DuckDuckGo how to.