How to de-radicalize my mom's youtube algorithm?

driving_crooner@lemmy.eco.br to No Stupid Questions@lemmy.world – 1427 points –

She's almost 70 and spends all day watching QAnon-style videos (but in Spanish), and every day she's anguished about something new. Last week she was asking us to start digging a nuclear shelter because Russia had dropped a nuclear bomb on Ukraine. Before that she was begging us to install reinforced doors because the indigenous population was about to invade the cities and kill everyone with poisonous arrows. I have access to her YouTube account and I'm trying to unsubscribe and report the videos, but the recommended videos keep feeding her more crazy shit.


At this point I would set up a new account for her - I’ve found Youtube’s algorithm to be very… persistent.

Unfortunately, it's linked to the account she uses for her job.

You can make "brand accounts" on YouTube that are a completely different profile from the default account. She probably won't notice if you make one and switch her to it.

You'll probably want to spend some time using it for yourself secretly to curate the kind of non-radical content she'll want to see, and also set an identical profile picture on it so she doesn't notice. I would spend at least a week "breaking it in."

But once you've done that, you can probably switch to the brand account without logging her out of her Google account.

I love how we now have to monitor the content watched by the generation that told us "Don't believe everything you see on the internet," like we would for children.

We can thank all the tetraethyl lead that leaded gasoline pumped into the air from the '20s to the '70s. Everyone got a nice healthy dose of lead while they were young. Made 'em stupid.

OP's mom breathed nearly 20 years' worth of lead-polluted air straight from birth, and OP's grandmother had been breathing it for 33 years by the time OP's mom was born. Probably not great for early development.

She’s going to seek this stuff out and the algorithm will keep feeding her. This isn’t just a YouTube problem, this is also a mom problem.

Delete watch history, find and watch nice channels and her other interests, log in to the account on a spare browser on your own phone periodically to make sure there's no repeat of what happened.


I'm a bit disturbed how people's beliefs are literally shaped by an algorithm. Now I'm scared to watch Youtube because I might be inadvertently watching propaganda.

My personal opinion is that it's one of the first large cases of misalignment in ML models. I'm 90% certain that Google and other platforms have for years been using ML models that take a user's history and the data they have about them as input, and produce the videos to offer them as output, with the goal of maximizing the time they spend watching videos (or on Facebook, etc.).

And the models eventually found out that if you radicalize someone, isolate them into a conspiracy that makes them an outsider or a nutjob, and then provide a safe space and an echo chamber on the platform, be it Facebook or YouTube, they will eventually start spending most of their time there.

I think this subject was touched upon in The Social Dilemma movie, but given what is happening in the world and how conspiracies and disinformation seem to be getting more and more common and people more radicalized, I'm almost certain the algorithms are to blame.
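The feedback loop described above can be sketched as a toy simulation. This is NOT YouTube's actual system, just a minimal epsilon-greedy bandit over invented video categories and invented watch times, where the synthetic "user" happens to watch extreme content longest. Nothing in the objective mentions conspiracies, yet the extreme category ends up dominating the recommendations:

```python
import random

# Invented categories and mean watch times (minutes); the reward signal
# is simulated watch time, the only thing the "recommender" optimizes.
CATEGORIES = ["cooking", "news", "conspiracy"]
MEAN_WATCH_MIN = {"cooking": 5.0, "news": 8.0, "conspiracy": 20.0}

def simulate(rounds=5000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    totals = {c: 0.0 for c in CATEGORIES}
    counts = {c: 0 for c in CATEGORIES}
    for _ in range(rounds):
        if rng.random() < epsilon or not all(counts.values()):
            choice = rng.choice(CATEGORIES)  # explore occasionally
        else:
            # exploit: pick the category with the best average watch time so far
            choice = max(CATEGORIES, key=lambda c: totals[c] / counts[c])
        watch_time = rng.gauss(MEAN_WATCH_MIN[choice], 1.0)  # noisy reward
        totals[choice] += watch_time
        counts[choice] += 1
    return counts

counts = simulate()
# "conspiracy" gets the overwhelming majority of recommendations,
# without the objective ever saying anything about content.
```

The point of the sketch: misalignment doesn't require malice, only an engagement reward plus whatever human behavior happens to maximize it.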

If YouTube's algorithm is optimizing for watch time, then the most optimal solution is to make people addicted to YouTube.

The scariest thing, I think, is that the way to optimize the reward is not to recommend a good video but to reprogram a human to watch as much as possible.

I think that making someone addicted to YouTube would be harder than simply, slowly radicalizing them into a shunned echo chamber around a conspiracy theory. If you try to make someone addicted to YouTube, they still have an alternative in the real world: friends and family to return to.

But if you radicalize them into something that makes them seem like a nutjob, you don't have to compete with their surroundings; the only place where they feel understood is on YouTube.

100% they're using ML, and 100% it found a strategy they didn't anticipate

The scariest part of it, though, is their willingness to continue using it despite the obvious consequences.

I think misalignment is not only likely to happen (for an eventual AGI), but likely to be embraced by the entities deploying them because the consequences may not impact them. Misalignment is relative

Fuck, this is dark and almost awesome, but not in a good way. I was thinking the fascist funnel was something of a deliberate thing, but maybe these engagement algorithms have more to do with it than large shadow actors putting the funnels into place. Then there's the folks who will create any sort of content to game the algorithm, and you've got a perfect trifecta of radicalization.

Fascist movements and cult leaders long ago figured out the secret to engagement: keep people feeling threatened, play on their insecurities, blame others for all the problems in people's lives, use fear and hatred to cut them off from people outside the movement, make them feel like they have found a bunch of new friends, etc. Machine learning systems for optimizing engagement are dealing with the same human psychology, so they discover the same tricks to maximize engagement. Naturally, this leads to YouTube recommendations directing users towards fascist and cult content.

That's interesting. That it's almost a coincidence that fascists and engagement algorithms have similar methods to suck people in.

You watch this one thing out of curiosity, morbid curiosity, or by accident, and at the slightest poke the goddamned mindless algorithm starts throwing this shit at you.

The algorithm is "weaponized" in favor of whoever screams the loudest, and I truly believe it started due to myopic incompetence/greed, not political malice. Which doesn't make it any better, as people don't know how to protect themselves from this bombardment, but the corporations like to pretend that ~~they~~ people can, so they wash their hands of it for as long as they are able.

Then on top of this, the algorithm has been further weaponized by even more malicious actors who have figured out how to game the system.
That's how toxic meatheads like infowars and joe rogan get a huge bullhorn that reaches millions. "Huh... DMT experiences... sounds interesting", the format is entertaining... and before you know it, you are listening to anti-vax and qanon excrement, your mind starts to normalize the most outlandish things.

EDIT: a word, for clarity

Whenever I end up watching something from a bad channel I always delete it from my watch history, in case that affects my front page too.

I do that, too.

However, I'm convinced that YouTube still has a "suggest list" bound to IP addresses. Quite often I'll have videos that other people in my household have watched suggested to me. Some of it can be explained by similar interests, but it happens suspiciously often.

I can confirm the IP-based suggestions!

My hubs and I watch very different things. Him: photography equipment reviews, photography how to’s, and old, OLD movies. Me: Pathfinder 2e, quantum field theory/mechanics and Dip Your Car.

Yet we both see stuff in the other’s Suggestions of videos the other recently watched. There’s ZERO chance based on my watch history that without IP-based suggestions YT is going to think I’m interested in watching a Hasselblad DX2 unboxing. Same with him getting PBS Space Time’s suggestions.

Huh, I tried that. Still got recommended incel videos for months after watching a moron "discuss" the Captain Marvel movie. Eventually went through and clicked "don't recommend this" on anything that showed up on my front page; that helped.

My normal YT algorithm was OK, but Shorts tries to pull me to the alt-right.
I had to block many channels to get a sane Shorts algorithm.

"Do not recommend channel" really helps

It really does help. I've been heavily policing my YouTube feed for years and I can easily see when they make big changes to the algorithm, because it tries to force-feed me polarizing or lowest-common-denominator content. Shorts are incredibly quick to smother me in rage bait, and if you so much as linger on one of those videos too long, you're getting a cascade of alt-right bullshit shortly after.

Using Piped/Invidious/NewPipe/insert your preferred alternative frontend or patched client here (Youtube legal threats are empty, these are still operational) helps even more to show you only the content you have opted in to.

Reason and critical thinking are all the more important in this day and age. They're just no longer taught in schools. With some simple key skills, like noticing fallacies or analogous reasoning, you will find that your view on life is far more grounded and harder to shift.

I think it's worth pointing out "no longer" is not a fair assessment since this is regularly an issue with older Americans.

I'm inclined to believe it was never taught in schools, and is probably a subject teachers are increasingly likely to want to teach (i.e., if politics didn't enter the classroom it would already be being taught, and might be in some districts).

The older generations were given catered news their entire lives, only in the last few decades have they had to face a ton of potentially insidious information. The younger generations have had to grow up with it.

A good example is that old people regularly click malicious advertising, fall for scams, etc. They're generally not good at applying critical thinking to a computer, whereas younger people (typically, though I hear this is regressing some with smartphones) know about this stuff and are used to validating their information (or at least have a better "feel" for what's fishy).

Just be aware that we can ALL be manipulated, the only difference is the method. Right now, most manipulation is on a large scale. This means they focus on what works best for the masses. Unfortunately, modern advances in AI mean that automating custom manipulation is getting a lot easier. That brings us back into the firing line.

I'm personally an Aspie with a scientific background. This makes me fairly immune to a lot of manipulation tactics in widespread use. My mind doesn't react how they expect, and so it doesn't achieve the intended result. I do know however, that my own pressure points are likely particularly vulnerable. I've not had the practice resisting having them pressed.

A solid grounding gives you a good reference, but no more. As individuals, it is down to us to use that reference to resist undue manipulation.


imagine if they taught critical media literacy in schools. of course that would only be critical media literacy with an american propaganda backdoor but still


I mean, you probably are, especially if it's explicitly political. All I can recommend is CONSTANT VIGILANCE!

YouTube's entire business is propaganda: Ads.

Lately the number of ads on YouTube has increased by an order of magnitude. What they managed to accomplish was driving me away.

I watch a lot of history, science, philosophy, stand up, jam bands and happy uplifting content... I am very much so feeding my mind lots of goodness and love it...

At this point, any channel that I know is either bullshit or annoying af I just block. Out of sight out of mind.

Same. I have ads blocked and open YouTube directly to my subbed channels only. Rarely open the home tab or check related videos because of the amount of click bait and bs.

Ohh I just use BlockTube to block channels/ videos I don't want to see.

Just this week I stumbled across a new YT channel that seemed to talk about some really interesting science. Almost subscribed, but something seemed fishy. Went to the channel, saw the other videos, and immediately got the hell out. Conspiracies and propaganda lurk everywhere and no one is safe. Mind you, I'm about to get my bachelor's degree next year, meaning I have received a proper scientific education. Yet I almost fell for it.

I have to clear out my youtube recommendations about once a week... no matter how many times I take out or report all the right-wing garbage, you can bet everything that by the end of the week there will be a Jordan Peterson or PragerU video in there. How are people who aren't savvy to the right-wing's little "culture war" supposed to navigate this?

I find it interesting how some people have so vastly different experience with YouTube than me. I watch a ton of videos there, literally hours every single day and basically all my recommendations are about stuff I'm interested in. I even watch occasional political videos, gun videos and police bodycam videos but it's still not trying to force any radical stuff down my throat. Not even when I click that button which asks if I want to see content outside my typical feed.

My youtube is usually ok but the other day I googled an art exhibition on loan from the Tate Gallery, and now youtube is trying to show me Andrew Tate.

At one point I watched a few videos about Marvel films and their negatives. One was about how Captain Marvel wasn't a good hero because she was basically invincible and all-powerful, etc. I started getting more and more suggestions about how bad the new strong female leads in modern films are. Then I started getting content about politically right-leaning shit. It started really innocuously, and it's hard to figure out if it's leading you a certain way until it gets further along. It really made me think when I'm watching content from new channels. Obviously I've blocked/purged all channels like that and my experience is fine now.

I watch a ton of videos there, literally hours every single day and basically all my recommendations are about stuff I'm interested in.

The algorithm's goal is to get you addicted to Youtube. It has already succeeded. For the rest of us who watch one video a day, if at all, it employs more heavy-handed strategies.

That's a good point. They don't care what I watch. They just want me to watch something.

The experience is different because it's not one algorithm for everyone.

Demographics are targeted differently. If you actually get a real feed, it's only because no one has yet paid YouTube for guiding you towards their product.

It would be an interesting experiment to set up two identical devices and then create different Google profiles for each just to watch the algorithm take them in different directions.

I don't understand how these people can endure enough ads to be lured in by qanon. The people of that generation generally don't know about decent adblockers.


the damage that corporate social media has inflicted on our social fabric and political discourse is beyond anything we could have imagined.

This is true, but one could say the same about talk radio or television.

Talk radio or television broadcasts the same stuff to everyone. It's damaging, absolutely. But social media literally tailors the shit to be exactly what will force someone farther down the rabbit hole. It's actively, aggressively damaging and sends people on a downward spiral way faster while preventing them from encountering diverse viewpoints.

I agree it’s worse, but i was just thinking how there are regions where people play ONLY Fox on every public television, and if you turn on the radio it’s exclusively a right-wing propagandist ranting to tell you democrats are taking all your money to give it to black people on welfare.

Well it's kind of true.

Uh, no. For one, Republicans spend like fuck, they just also cut taxes so they end up running up the deficit. Social welfare programs account for a tiny fraction of government budgets. The vast majority is the military and interest payments on debt.

Actually, liberal taxes pay for rural white America.

And it sucks people back in like a breadcrumbing ex when it hasn't seen you active recently.

Yes, I agree - there have always been malevolent forces at work within the media - but before facebook started algorithmically whipping up old folks for clicks, cable TV news wasn't quite as savage. The early days of hate-talk radio was really just Limbaugh ranting into the AM ether. Now, it's saturated. Social media isn't the root cause of political hatred but it gave it a bullhorn and a leg up to apparent legitimacy.

Social media is more extreme, but we can’t discount the damage Fox and people like Limbaugh or Michael Savage did.

Populism and racism are as old as societies. Ancient Greece already had them. Rome fell to them. Christianity was born out of them.

Funnily enough, people always complained about how bad their society was because of this new thing. Like 5 thousand years ago already. Probably earlier even.

Which is not to say we shouldn't do anything about it. We definitely should. But common sense won't save us unfortunately.


In the google account privacy settings you can delete the watch and search history. You can also delete a service such as YouTube from the account, without deleting the account itself. This might help starting afresh.

I was so weirded out when I found out that you can hear ALL of your "hey Google" recordings in these settings.

Yeah, anything you send Google/Amazon/Facebook they will keep. I have been moving away from them. Protonmail for email, etc.

Log in as her on your device. Delete the history, turn off ad personalisation, unsubscribe and block dodgy stuff, like and subscribe healthier things, and this is the important part: keep coming back regularly to tell YouTube you don't like any suggested videos that are down the qanon path/remove dodgy watched videos from her history.

Also, subscribe and interact with things she'll like - cute pets, crafts, knitting, whatever she's likely to watch more of. You can't just block and report, you've gotta retrain the algorithm.

Yeah, when you go on the feed make sure to click on the 3 dots for every recommended video and "Don't show content like this" and also "Block channel" because chances are, if they uploaded one of these stupid videos, their whole channel is full of them.

Would it help to start liking/subscribing to videos that specifically debunk those kinds of conspiracy videos? Or, at the very least, demonstrate rational concepts and critical thinking?

Probably not. This is an almost 70 year old who seems not to really think rationally in the first place. She's easily convinced by emotional misinformation.

Probably just best to occupy her with harmless entertainment.

We recommended her a YouTube channel about linguistics, and she didn't like it because the PhD in linguistics was saying that it's OK for language to change. Unfortunately, there comes a time when people just want to see what already confirms their worldview, and anything that challenges it is taken as an offense.

Sorry to hear about your mom, and good on you for trying to steer her away from the crazy.

You can retrain YT's recommendations by going through the suggested videos, clicking the '...' menu on each bad one, and choosing the Don't recommend channel or Not interested buttons. Do this as many times as you can.

You might also want to try subscribing to or watching a bunch of wholesome channels that your mum might be interested in (hobbies, crafts, travel, history, etc.) to push the recommendations in a direction that will meet her interests.

Edit: mention subscribing to interesting, good videos, not just watching.

You might also want to try watching a bunch of wholesome ones that your mum might be interested in (hobbies, crafts, travel, history, etc.) to push the recommendations in a direction that will meet her interests.

This is a very important part of the solution here. The algorithm adapts to new videos very quickly, so watching some things you know she's into will definitely change the recommended videos pretty quickly!

OP can make sure this continues by logging into youtube on private mode/non chrome web browser/revanced YT app using their login and effectively remotely monitor the algorithm.

(A non-Chrome browser that you don't use is best, so that you keep your stuff separate; otherwise Google will assume your device is connected to the account, which can be potentially messy in niche cases.)

Delete all watch history. It will give her a whole new set of videos. Like a new algorithm.

Where does she watch her YouTube videos? If it's a computer and you're a little techie, switch to Firefox, then change the user agent to Nintendo Switch. I find that YouTube serves less crazy videos for the Nintendo Switch.
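If anyone wants to try this in Firefox, the user agent can be overridden with a string preference in about:config. The pref name `general.useragent.override` is real; the exact Switch UA string below is just an example that circulates online and may be out of date:

```
// In about:config, create a new string preference:
general.useragent.override = "Mozilla/5.0 (Nintendo Switch; WifiWebAuthApplet) AppleWebKit/606.4 (KHTML, like Gecko) NF/6.0.1.15.4 NintendoBrowser/5.1.0.13343"
```

Note this overrides the user agent for every site, so it's easiest to do in a separate Firefox profile used only for YouTube.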

This never worked for me. BUT WHAT FIXED IT WAS LISTENING TO SCREAMO METAL. APPARENTLY THE YOUTUBE KNEW I WAS A LEFTIST then. Weird, but I've been free ever since.

This comment made me laugh so hard and I don't know why... I love it!

in my head, I read it like someone was talking, and then had to start shouting over the sound of a train passing by, then return to normal volume as it passed

Or like someone whose SCREAMO METAL SUDDENLY TURNED ON AND THEY COULDN'T HEAR themselves thinking anymore.

change the user agent to Nintendo Switch

You mad genius, you

Here is what you can do: make a bet with her on things that she thinks are going to happen in a certain timeframe; tell her that if she's right, it should be easy money. I don't like gambling, I just find it easier to convince people they might be wrong when they have to put something at stake.

A lot of these crazy YouTube cult channels have gotten fairly insidious, because they will at first lure you in with cooking, travel, and credit card tips, then they will direct you to their affiliated Qanon style channels. You have to watch out for those too.

they will at first lure you in with cooking, travel, and credit card tips

Holy crap. I had no idea. We've heard of a slippery slope, but this is a slippery sheer vertical cliff.
Like that toxic meathead rogan luring the curious in with DMT stories and the like, and this sounds like that particular spore has burst and spread.

Rogan did not invent this; it's the modern version of Scientology's free e-meter reading, and historical cults have always done similar things for recruitment. You shouldn't be paranoid and suspicious of everyone, but you should know that these things exist and watch out for them.

this is the modern version of Scientology’s free e-meter reading

I actually have a fun story about that. They once had a booth on my college campus, so just for fun I let them hook up their e-meter to me. I was extremely dubious that this device did what it claimed, but just to mess with it I tried as hard as I could to think calm and relaxing thoughts. To my amazement, the needle actually went down to the "not stressed" end, so I've gone from thinking that the e-meter is almost certainly bunk to thinking that it is merely very probably bunk.

That isn't the funny part, though. The funny part was that the person administering the test got really concerned and said that the device wasn't working properly and had me take the test again. I did so, and once again the needle went down to the "not stressed" end. The person administering the test then apologized profusely that the device was clearly not working and said that they nonetheless recommended that I take their classes to deal with the stress in my life. So the whole experience was absolutely hilarious, although at the same time incredibly sad because I strongly suspect that the people at the booth weren't saying these things in order to deceive me but because they were genuinely true believers who were incapable of seeing the plain truth even when it stared them in the face.

It's a skin galvanometer. It measures sweating directly, based on the fact that sweat is electrically conductive, then interprets more sweat as more stress. This is a fallacy: the fact that stressed people tend to sweat does NOT imply that sweaty people tend to be stressed. That works to the advantage of Scientologists, because genuinely stressed people will measure high, but so will a lot of unstressed people, or people who are only stressed by the fact that they suddenly find themselves in an experiment. False positives and true positives are much more common than true or false negatives, and also much more profitable for Scientologists. When you successfully beat the test, the person administering it insisted you go again because it's kinda rigged, and they assumed a second reading would come back with the needle pointing strongly toward "give us a bunch of money".

This is the central fallacy behind lie detectors as well: they measure skin galvanic response, heart rate, and other things that are correlated with stress but can have myriad other causes, then they assume that people are stressed when they lie, then they take a flying leap to the conclusion that anyone displaying symptoms of stress must be lying.

https://en.wikipedia.org/wiki/Electrodermal_activity
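The false-positive point can be made concrete with some made-up numbers (an illustration only, not real measurements): even if most stressed people sweat, most sweaty people need not be stressed.

```python
# Invented figures: 10% of test-takers are stressed, stressed people
# usually sweat, but plenty of calm people sweat too.
population = 1000
stressed = 100
p_sweat_given_stressed = 0.9
p_sweat_given_calm = 0.3

sweaty_stressed = stressed * p_sweat_given_stressed           # 90 people
sweaty_calm = (population - stressed) * p_sweat_given_calm    # 270 people
p_stressed_given_sweaty = sweaty_stressed / (sweaty_stressed + sweaty_calm)
# 90 / 360 = 0.25: three out of four "stressed" readings are false positives
```

So a meter that faithfully detects sweat can still be wrong most of the time about stress, which is exactly why it works as a recruiting prop.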

then interprets more sweat as more stress. this is a fallacy as the fact that stressed people tend to sweat does NOT imply that sweaty people tend to be stressed

It's close enough to convince people. If there were no correlation between sweatiness and stress, the E-meter wouldn't be a convincing recruiting tool. If it always just went to "stressed" no matter who used it or how, it wouldn't be as convincing either. The fact that sweatiness and stress are somewhat correlated means that it can be used to bring people into the cult.

Oh, no one is doubting that it's a good scam. It's quantifiably a truly great scam, having bilked billions from a whole lot of people. But it is a scam, and the electroconductivity of your skin is not a measure of your spiritual state.

"Today, we'll learn how to make an asparagus and crabmeat omelet with mozzarella cheese. Also be sure to check out our other channels in the links right here (points to the upper left hand corner of the screen) and here (lower left hand corner). Please remember to like this video and suscribe, it really helps our channel."

it took a total of one hour of youtube autoplay for me to go from "here's how to can your garden veggies at home" to "the (((globalists))) want to outbreed white people and force them into extinction"

My sister always mocks her and starts fights with her about how she has always been wrong before, and she just gets defensive. It has not worked so far, except to further damage their relationship.

No, don't mock her, that's the last thing you should do. Don't be emotional; be calm, and ask her if she wants to take the easy money if she is so sure about these things. It has to be provable, objective events where there is no wiggle room, like the natives with arrows, for example.

Cults will always want to isolate their prospective members from their current support groups. Don't let them do that past the point of no return.

See if you can convince her to write down her predictions in a calendar, like "by this date X will have happened".

You can tell her that she can use that to prove to her doubters that she was right and she called it months ago, and that people should listen to her. If somehow she's wrong, it can be a way to show that all the things that freaked her out months ago never happened and she isn't even thinking about them anymore because she's freaked out about the latest issue.

Going to ask my sister to do this, it can be good for both of them.

It's not going to work.

You can't reason someone out of a position they didn't reason themselves into. You have to use positive emotion to counteract the negative emotion. Literally just changing the subject to something like "Remember when we went to the beach? That was fun!" Positive emotions and memories of times before the cult brainwashing can work.

The good part about it is that once she writes her predictions down (or maybe both of them do), they can maybe move on, and talk about something other than the conspiracies.

Assuming all the predictions are wrong, especially if your mom was worked up about them months ago but completely forgot about them afterward, it can be a good way to talk about how they're manipulating her emotions.

I think it's sad how so many of the comments are sharing strategies about how to game the Youtube algorithm, instead of suggesting ways to avoid interacting with the algorithm at all, and learning to curate content on your own.

The algorithm doesn't actually care that it's promoting right-wing or crazy conspiracy content; it promotes whatever keeps people's eyeballs on YouTube. The fact is that this will always be the most enraging content. Using the "not interested" and "block this channel" buttons doesn't make the algorithm stop trying to push this content; you're teaching it to improve its strategy to manipulate you!

The long-term strategy is to get people away from engagement algorithms. Introduce OP's mother to a patched Youtube client that blocks ads and algorithmic feeds (Revanced has this). "Youtube with no ads!" is an easy way to convince non-technical people. Help her subscribe to safe channels and monitor what she watches.

Not everyone is willing to switch platforms that easily. You can't always be idealistic.

That's why I suggested Revanced with "disable recommendations" patches. It's still Youtube and there is no new platform to learn.

Also consider - don't sign in to YouTube. Set uBlock, and Sponsorblock, too - so when youtube does get watched, the ads and promotions get skipped.

Everyone here is missing the point - by signing in to YouTube, you give Google more power to dominate your life.

With a simple inoreader extension in Firefox, you can visit a youtube/youtuber/video page and subscribe the feed.

Search interesting channels and save them to Feedly, or Inoreader.

Pin those to Firefox, so that your feeds are always refreshed and visible.
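For reference, YouTube publishes a public Atom feed per channel at a stable URL pattern, so a tiny helper can build the URL to paste into Inoreader or Feedly. The channel ID below is a placeholder, not a real channel:

```python
def channel_feed_url(channel_id: str) -> str:
    """Build the public Atom feed URL for a YouTube channel ID (the 'UC...' form)."""
    if not channel_id.startswith("UC"):
        raise ValueError("expected a channel ID starting with 'UC'")
    return f"https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"

# Placeholder channel ID for illustration:
url = channel_feed_url("UCxxxxxxxxxxxxxxxxxxxxxx")
# → https://www.youtube.com/feeds/videos.xml?channel_id=UCxxxxxxxxxxxxxxxxxxxxxx
```

You can find a channel's `UC...` ID in the channel page URL or its page source; the feed then updates in any RSS reader with no account, no sign-in, and no recommendations.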

How does revanced remove the algorithm stuff? I have been using it for a long time but never saw this feature

I don't know for sure, because I actually like my algorithmic recommendations on YT (I've done a good job of carefully curating it so it shows me good stuff). But if I had to guess, I'd say it removes "related videos" from the watch page and removes the "home" tab. So it will show search results and subscriptions, but not the general algorithmic content.

There should be a patch for it that hides the "recommended" feed in the homepage. I'm not certain because I never use Youtube with an account or the official website/app, so I don't get targeted recommendations.

How do you disable algorithmic feeds in revanced? This sounds perfect..

Ooh, you can even set the app to open automatically to the subscription feed rather than the algo-driven home. The app probably does need somebody knowledgeable about using the app patcher every half-dozen months to update it, though.

In addition to everything everyone here said, I want to add this: don't underestimate the value of adding new benign topics to her feed. Does she like cooking, gardening, DIY, or art content? Find a playlist from a creator and let it autoplay. The algorithm will pick it up and start to recommend that creator and others like it. You just need to "confuse" the algorithm so it starts to cater to different interests. I wish there was a way to block or mute entire subjects on there. We need to protect our parents from this mess.

Do the block and uninterested stuff but also clear her history and then search for videos you want her to watch. Cover all your bases. You can also try looking up debunking and deprogramming videos that are specifically aimed at fighting the misinformation and brainwashing.

This is a really good idea - so she begins to see the videos of people who were once where she is now.

I've found that if I remove things from my history, it stops suggesting related content, hopefully that helps.

That's what I do, but it requires doing it frequently. As for OP, they should start fresh with a new profile, which can be under the same login via channels.

Once you have it dialed you can disable your search/watch history.

Some of my personal hobbies/interests tend to appeal to the right wing tinfoil hat fringe (outdoorsy hunting/camping/survivalist stuff, ham radio, a casual interest in guns, and I have a small libertarian streak although I'd generally call myself a liberal these days, etc.) So left unchecked, the algorithm will show me some crazy stuff. It used to happen occasionally that while I was googling around for info and reviews on some piece of kit, I'd be clicking into different forums to see people discussing it, then I'd realize at some point that I was on the Stormfront forums (and immediately nope the fuck out of there). Usually it would be a totally normal conversation of people discussing a ham radio or gun or whatever, until halfway down the page someone would drop a comment like "in my opinion, every nationalist should own one of these".

So my YouTube history has been paused for years, probably since well before trump and q anon and everything really took off, and frankly I've been happy with my recommendations. I haven't looked too much into how those recommendations are determined now but it seems to mostly be based on what channels I'm subscribed to.

There is definitely more going into it though, because I get a lot of recommendations about the breed of dog I have despite not subscribing to any dog-related channels, so it's definitely pulling from the rest of my Google search history or social media or something, or secretly keeping track of my history a little in the background somewhere, so you'll probably have to do some pruning in the rest of her online accounts too.

Some things just completely skew your home page with just a single viewing. I am very careful with my watch history, only watching one off stuff from other sites in a private window. That seems to work well.

That was something I tried today: I deleted the YouTube history and paused it. It didn't have immediate effects, but I hope it helps somewhat.

Another thing that I'd consider doing is to unpause her history a bit, have YouTube play a few videos about other stuff (topics that she'd watch, based on her tastes, minus the q-anon junk), then pause it again. This might tell YouTube "I want to watch this".

Side note: that's how mainstream media is making every single one of us flat and single-minded. So you watched X? Onwards you shall see X nonstop.

I fucking hate this.

I got a new phone recently and wanted to watch some videos on recommend cases and screen protectors, and now 70% of my scrolling feed is related to this God damn phone.

I own it already youtube, I don't need to see any more content on this phone, ffs

I had a similar case. I watched a video about a former jewel thief or bank robber analyzing the GTA5 jewel store heist. After that I was getting recommendations for every single one of his vlogs, where he talked about what life in prison was like. It went on for a few weeks, but it eventually stopped as I never clicked on them.

I would not pause the history; I would populate the history with balanced content and then (if you feel like your mother will actively look for radicalized content) pause it.

So that the algorithm will have something "balanced" to work on.

Re not acting immediately: per GDPR they can't keep deleted activity data more than a few months (and they probably delete it before the deadline).

Delete also the search history. After that, search non toxic topics, like live shows of her favourite artists, recipes, nature documentaries...

Video likes have an effect on the algorithm. If she has lots of liked videos, delete them. Any playlists she made would influence the algorithm too.

Not sure if video feedback data gets used, but you can clear them too. I’m talking about those “What did you think of this video?”, “Please tell us more” questions.

Edit: I’m not entirely sure if there is an option to delete the “likes”. You might have to manually “unlike” them.

My mother's partner had some small success. On the one hand, doing what you do already: unsubscribe from bad stuff and subscribe to some other stuff she might enjoy (nature channels, religious stuff which isn't so toxic, arts and crafts...), and he also blocked some tabloid news on the router. On the other hand, he tried getting her outside more; they go on long hikes now and travel around a lot, plus he helped her find some hobbies (the arts and crafts one) and sits with her sometimes, showing lots of interest and positive affirmations when she does that. Since religion is so important to her, they also often go to (a not so toxic or cultish) church and church events and the choir and so on.

She's still into it to an extent. Anti-vax will probably never change, since she can't trust doctors and hates needles, and she still votes accordingly (which is far right in my country) too, which is unfortunate, but she's calmed down a lot over it and isn't quite so radical and full of fear all the time.

Edit: oh, and myself, I found out about a book, "How to Have Impossible Conversations", and the topics in there can be weird, but it helped me stay somewhat calm when confronted with outlandish beliefs and not reinforce them, though who knows how much that helps, if anything (I think her partner contributed a LOT more than me).

By now it is beyond apparent that corporations cannot be relied upon to regulate themselves. In the dynamic they have set in motion, the endgame is to keep us in a perpetual state of anguish.
That this type of vulgarly cynical content is not considered obscene and banned is a failure of the highest magnitude.


YouTube has a delete option that will wipe the recorded history. Then just watch a couple of videos and subscribe to some healthy stuff.

As you plan on messing with her feed, I'd like to warn you that a sudden change in her recommendations could seem to her like the whole internet got censored and she can't see the truth anymore. She would be cut off from a sense of community and a sense of having special inside knowledge, and that may make things worse rather than better.

My non-professional prediction is that she would get bored with nothing to worry about and start actively seeking out bad news to worry over.

I’d like to warn you that a sudden change in her recommendations could seem to her like the whole internet got censored and she can’t see the truth anymore.

This is exactly the response my neighbors have. "Things get censored, so that must mean that the gov must be hiding this info" It's truly insane to see this happen

What you're describing is a cult or psychosis. Stopping it is a good thing, not something to be worried about.

Well yeah, but the concern is unintended consequences, which sound entirely likely. It's kind of fucked up to even be considering doing this to an adult, entirely entitled to their own choice of viewing habits, without their knowledge and by surreptitious use of their account. It's only dubiously ethical because it's an act of kindness against machine-generated manipulation of a far more insidious nature, done for far less than altruistic reasons.

You'd hardly want it to backfire after taking this step. By posing this question, the OP obviously already considers the pseudo-cult his mum's getting sucked into a bad thing, so it doesn't need further signalling. But being sure to do it carefully, with an eye on the actual effect, is probably wise, or the whole endeavour would be a waste and might push her further into the waiting arms of lunatics and charlatans.

Cold turkey isn't the best solution if you truly think that is where they're at for exactly the reason they described. How don't you get that?

I agree that cult mindset or psychosis is what she has going on. But suddenly removing the source from her computer without her knowledge could backfire because she's already so invested.

You can manage all recommendations by going to account > Your data in YouTube

From there you can remove the watch history and search history or disable them entirely. It should make it a lot easier to reset the recommendations.

I did that once, All it did was lock-in the algorithm to serve alt-right alpha-male misogynistic BS just because (guessing) google knows I have an odysee account, and it started out by catering to that demographic...

This is the best and most actionable info here OP. From there can you install some parental software to block certain kinds of videos?

Go into the viewing history and wipe it. Then maybe view some more normal stuff to set it on a good track. That should give a good reset for her, though it wouldn't stop her from just looking it up again, of course.

This is the right answer. Removing browsing history is an easy step with very big effect

This. Watch the majority report lol

Oh no! Sam Seder, what a fucking nightmare!

And now the song will repeat in my head, forever.

if you have access to her computer instead of just her account you could try installing the extension channel block (firefox | chrome). with that you can both block individual channels so that videos from these channels will NEVER EVER show up in sidebar recommendations, search results, subscription feeds, etc. you can also set up regex queries to block videos by title, e.g. the query "nuke*" will stop video recommendations if the title of the video contains "nuke", "nukes", or any other word that starts with "nuke"
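for anyone curious what those title filters amount to under the hood, here's a rough sketch in Python (the patterns are hypothetical examples, not the extension's actual internals):

```python
import re

# Hypothetical block patterns, in the spirit of the extension's title filters.
BLOCK_PATTERNS = [
    re.compile(r"\bnuke\w*", re.IGNORECASE),   # nuke, nukes, nuked, ...
    re.compile(r"\bnuclear\b", re.IGNORECASE),
]

def is_blocked(title: str) -> bool:
    """Return True if any block pattern matches the video title."""
    return any(p.search(title) for p in BLOCK_PATTERNS)

print(is_blocked("BREAKING: nuclear strike imminent"))  # True
print(is_blocked("Relaxing garden tour with grandma"))  # False
```

the real extension does the matching client-side as the page renders, but the filtering logic is essentially this.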

Oof that’s hard!

You may want to try the following though to clear the algorithm up

Clear her YouTube watch history: This will reset the algorithm, getting rid of a lot of the data it uses to make recommendations. You can do this by going to “History” on the left menu, then clicking on “Clear All Watch History”.

Clear her YouTube search history: This is also part of the data YouTube uses for recommendations. You can do this from the same “History” page, by clicking “Clear All Search History”.

Change her ‘Ad personalization’ settings: This is found in her Google account settings. Turning off ad personalization will limit how much YouTube’s algorithms can target her based on her data.

Introduce diverse content: Once the histories are cleared, start watching a variety of non-political, non-conspiracy content that she might enjoy, like cooking shows, travel vlogs, or nature documentaries. This will help teach the algorithm new patterns.

Dislike, not just ignore, unwanted videos: If a video that isn’t to her taste pops up, make sure to click ‘dislike’. This will tell the algorithm not to recommend similar content in the future.

Manually curate her subscriptions: Unsubscribe from the channels she’s not interested in, and find some new ones that she might like. This directly influences what content YouTube will recommend.

HEAVY use of "Don't Recommend Channel" is essential to make YouTube tolerable. Clobber all the garbage.

Yeah "Not interested" doesn't do a damn thing.

It's like a "don't recommend this specific video ... for a few months" button

More like "alright chief, don't care" from my experience.

BlockTube extension is also a must. It will make sure a video or channel NEVER appears again, no matter what.

I curate my feed pretty often so I might be able to help.

The first, and easiest, thing to do is to tell Youtube you aren't interested in their recommendations. If you hover over the name of a video then three little dots will appear on the right side. Clicking them opens a menu that contains, among many, two options: Not Interested and Don't Recommend Channel. Don't Recommend Channel doesn't actually remove the channel from recommendations but it will discourage the algorithm from recommending it as often. Not Interested will also inform the algorithm that you're not interested, I think it discourages the entire topic but it's not clear to me.

You can also unsubscribe from channels that you don't want to see as often. Youtube will recommend you things that were watched by other people who are also subscribed to the same channels you're subscribed to. So if you subscribe to a channel that attracts viewers with unsavory video tastes then videos that are often watched by those viewers will get recommended to you. Unsubscribing will also reduce how often you get recommended videos by that content creator.

Finally, you should watch videos you want to watch. If you see something that you like then watch it! Give it a like and a comment and otherwise interact with the content. Youtube knows when you see a video and then go to the Channel's page and browse all their videos. They track that stuff. If you do things that Youtube likes then they will give you more videos like that because that's how Youtube monetizes you, the user.

To de-radicalize your mom's feed I would try to

  1. Watch videos that you like on her feed. This introduces them to the algorithm.
  2. Use Not Interested and Don't Recommend Channel to slowly phase out the old content.
  3. Unsubscribe from some channels she doesn't watch a lot, so she won't notice.

OP, listen to this comment. YouTube's goal is to feed you as much related content as possible to keep you on the site as long as possible. Radical content or otherwise, any engagement to them is positive. You can spend some time curating the feed so that the algorithm works in your favor, and the algorithm will adjust very quickly to new interests.

I can confirm that this works quite well. I use these tactics all the time on my parents' feed to keep them from watching too much "crap" news, pardon my French. There's a very notorious news channel in our country that insists on feeding bad (and only the bad) news - I often remove their channel from the suggested feed and play videos of funny fails/wins, cute cats, Daily Dose of Internet and other happy nonsense.

Give us an update sometime @driving_crooner@lemmy.eco.br , hope all goes well with your mom.

Delete her youtube history entirely. Rebuild the search algo from scratch.

You can do this? I'm tempted to do it for myself.

Yup, pretty easy too. I just checked on my mobile app; this is how you do it:

  • Click 'Library'
  • On your History section, click 'view all'
  • Then click the three dots upper right
  • Click 'clear all watch history'

You can also do this from the Google history management page on the web, but I'm not sure of the steps, though.

I'm not sure about this tbh, I've had YouTube history disabled/cleared for years and it still recommends content that is relevant to things I've been watching. But maybe it at least can help?


Yes. I’ve nuked my history once to start from a clean slate. Alternatively, you can also delete harmful videos from your history one by one, but that could take a while.


In Google's settings you can turn off watch history and nuke what's there, which should affect recommendations to a degree. I recommend you export her YouTube subscriptions to people she likes (that aren't q-anon), set up an account on an Invidious instance, and import those subs. This has the benefit of no ads and no tracking/algorithm-making. The recommendations can be turned off too, I think. FreeTube for desktop and NewPipe for Android are great third-party YouTube scrapers. The YouTube subs can also be imported into an RSS feed client.
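If you go the RSS route: Google Takeout exports subscriptions as a list of channel IDs, and most feed readers accept OPML imports. A rough sketch of stitching the two together (the channel names and IDs here are placeholders):

```python
# Turn a {channel title: channel ID} mapping into a minimal OPML file that
# RSS clients (Inoreader, Thunderbird, etc.) can import.
# The titles and IDs supplied by the caller are placeholders.
from xml.sax.saxutils import quoteattr

def subs_to_opml(channels: dict[str, str]) -> str:
    """Build an OPML document pointing at each channel's public Atom feed."""
    outlines = []
    for title, channel_id in channels.items():
        feed = f"https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"
        outlines.append(
            f'    <outline type="rss" text={quoteattr(title)} xmlUrl={quoteattr(feed)}/>'
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<opml version="2.0">\n'
        "  <head><title>YouTube subscriptions</title></head>\n"
        "  <body>\n" + "\n".join(outlines) + "\n  </body>\n</opml>"
    )

print(subs_to_opml({"Nature Channel": "UCplaceholder1"}))
```

Save the output as a `.opml` file and import it; the reader then polls YouTube's public feeds directly, with no recommendations in sight.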

If she watches on a laptop or PC you can change the icons to what she is used to using. Like if she uses the Chrome browser, you can switch the PeerTube icon to a Chrome icon so she clicks that one instead.

Make her a new account, pre-train it with some wholesome stuff and explicitly block some shit channels.

First unsub from worst channels and report a few of the worst channels in general feed with "Don't show me this anymore". Then go into actual Google profile settings, not just YouTube. Delete and Pause/Off Watch History and even web history. It will still eventually creep back up, but temp relief.

I listen to podcasts that talk about conspiratorial thinking and they tend to lean on compassion for the person who is lost. I don't think that you can brow-beat someone into not believing this crap, but maybe you can reach across and persuade them over a lot of time. Here are two that might be useful (the first is on this topic specifically, the second is broader):

https://youarenotsosmart.com/transcript-qanon-and-conspiratorial-narratives/ https://www.qanonanonymous.com

I wish you luck and all the best. This stuff is awful.

Switch her to FreeTube on desktop. Can still subscribe to keep a list of channels she likes, but won't get the YouTube algorithm recommendations on the home page.

For mobile something like newpipe for the same algorithm removed experience.


This is a suggestion with questionable morality BUT a new account with reasonable subscriptions might be a good solution. That being said, if my child was trying to patronize me about my conspiracy theories, I wouldn't like it, and would probably flip if they try to change my accounts.

Yeah the morality issue is the hard part for me... I've been entrusted by various people in the family to help them with their technology (and by virtue of that not mess with their technology in ways they wouldn't approve of), violating that trust to stop them from being exposed to manipulative content seems like doing the wrong thing for the right reasons.

Really? That seems far-fetched. Various people in the family specifically want you not to mess with their technology?

If the algorithm is causing somebody to become a danger to others and potentially themselves, I'd expect in their right state of mind, one would appreciate proactive intervention.

eg. archive.is/https://www.theguardian.com/media/2019/apr/12/fox-news-brain-meet-the-families-torn-apart-by-toxic-cable-news

I think this is pretty much what it boils down to. Where do you draw the line between having the right to expose yourself to media that validates your world view and this media becoming a threat to you to a point where you require intervention?

I've seen enough people discussing their family's Qanon casualties to recognize it's a legitimate problem, not to mention tragic in many cases, but I would still think twice before 'tricking' someone. What if she realizes what's happening and becomes more radicalized by it? I find that direct conversation, discussion, and confrontation work better; or at least that worked with family of mine who believe in bullshit.

That being said, the harmful effects of being trapped in a bubble by an algorithm are largely undisputed.

Wondering whether a qanon person would be offended at you deradicalising them seems like overthinking - it's certainly possible, but most likely fine to do anyway. The only case where you'd think twice is if something similar has happened before, or if this person has a pattern of falling into bad habits/cultish behaviour in the past. In which case you have a bigger problem on your hands, or just a temporary one.

I guess despite them being Qanon, I still see and believe in the human in them, and their ultimate right to believe stupid shit. I don't think it's ever 'overthinking' when it comes to another human being's privacy and freedom. I actually think it's bizarre to quickly jump into this and decide to alter the subscriptions behind their back, like they're a 2-year-old child without even the perception to understand basic shit.

Nobody said this had to be an instant/quick reaction to anything. If you can see that somebody has 'chosen' to fall into a rabbithole and self destruct, becoming an angrier, more hateful (bigoted) and miserable person because of algorithms, dark patterns and unnatural infinite content spirals, I'd recognise that it isn't organic or human-made but in fact done for the profit motive of a large corporation, and do the obvious thing.

If you're on Lemmy because you hate billionaire interference in your life why allow it to psychologically infect your family far more insidiously on youtube?

Exactly, no one said that.

Well, you argued that it's 'bizarre to quickly jump' into it and interfere.

I'm not sure what your point is here.

I guess what I'm saying is that I'll think twice before deciding something isn't for someone to the point where I'd have to interfere without their own knowledge and consent (as opposed to talking to them).

The problem with censorship is that one always assumes they are correct and the other person is deluded. In many cases, like this one, it could be true. In others, it's not. My point was to take caution when deciding something about someone's life.

Consider it from a different angle - if a techy Q-anon "fixed" the algorithm of someone whose device they had access to due to tech help. That person would rightfully be pissed off, even if the Q-anon tech nerd explained that it was for their own good and that they needed to be aware of this stuff etc.

Obviously that's materially different to the situation at hand, but my point is that telling someone that what you've done is necessary and good will probably only work after it's helped. Initially, they may still be resistant to the violation of trust.

If I think of how I would feel in that situation, I feel a strong flare of indignant anger that I could see possibly manifesting in a "I don't care about your reasons, you still shouldn't have messed with my stuff" way, and then fermenting into further ignorance. If I don't let the anger rise and instead sit with the discomfort, I find a bunch of shame - still feeling violated by the intervention, but sadly realising it was not just necessary, but actively good that it happened, so I could see sense. There's also some fear from not trusting my own perceptions and beliefs following the slide in reactionary thinking. That's a shitty bunch of feelings and I only think that's the path I'd be on because I'm unfortunately well experienced in being an awful person and doing the work to improve. I can definitely see how some people might instead double down on the anger route.

On a different scale of things, imagine if one of my friends who asked for tech help was hardcore addicted to a game like WoW, to the extent that it was affecting her life and wellbeing. Would it be acceptable for me to uninstall it and somehow block any attempts to reinstall? For me, the answer is no. This feels different to the Q-anon case, but I can't articulate why exactly.

Better to be embarrassed temporarily than lose a decade of precious time with your family on stuff that makes you angry on the internet.

You're seeing a person who freely made choices here, perhaps like the gamer, but I see a victim of opportunists on youtube, who may have clicked on something thinking it was benign and unknowingly let autoplay and the recommendations algorithm fry their brain.

You probably think the gamer situation is different because they, unlike the boomer, are aware of what's happening and are stuck in a trap of their own making. And yes, in such a situation, I'd talk it out with them before I did anything since they're clearly (in some ways) more responsible for their addiction, even though iirc some people do have a psychological disposition that is more vulnerable that way.

edit: I want to clarify that I do care about privacy, it's just that in these cases of older angry relatives (many such cases), I prioritise mental health.

Maybe not so far fetched. I work in hospice, with the vast majority of the patients I see in their 75-95+yo range. While most have no interest in technology, it's not uncommon for the elderly to have "that grandchild" that helps everyone set up their cell phone, "get the Netflix to work," set up Ring doorbells, etc. I've even known some to ask their grandchild to help their equally elderly neighbor (who doesn't have any local family) with their new TV. It's a thing.

I reworded my comment to clarify (my original wording was a bit clumsy).

I don't really think they're a danger to others, any more than their policy positions, in my opinion, are harmful to some percentage of the population. i.e. they're not worried about indigenous populations invading and killing people with poison arrows, but they do buy into some of the anti-establishment doctors when it comes to issues like COVID vaccination.

It's kind of like "I don't think you're a great driver, but I don't think you're such a bad driver I should be trying to subvert your driving." Though it's a bit of a hard line to draw...

After watching a hospice patient cry because (according to her) the Dr interviewed on Fox News talked about how he doesn't do abortions anymore after performing a late term abortion where the mother went into labor and delivered the baby before he could kill it, so he cleaned up the baby and consoled it as he discussed with the parents their options on how to dispatch it after the fact. She was inconsolable. But in drinking Fox's Kool aid, it was the only channel she would watch.

For moral reasons I will take any opportunity to nudge the vulnerable away from the harm certain entities create.

After watching a hospice patient cry because (according to her) the Dr interviewed on Fox News talked about how he doesn’t do abortions anymore after performing a late term abortion where the mother went into labor and delivered the baby before he could kill it, so he cleaned up the baby and consoled it as he discussed with the parents their options on how to dispatch it after the fact. She was inconsolable. But in drinking Fox’s Kool aid, it was the only channel she would watch.

I don't understand what happened in this story.

I think it's hard to have a universal morality. I wouldn't want my family forcing their moral judgements on me if the roles were reversed. e.g. I'm not a car guy, but my family members wouldn't (even if they could) make it so it only drives to "approved" locations.

Like the other commentor said, I think it's better to talk about these issues, though that too can be hard, I can't say I've made much visible traction.

Well, I assume neither you or I are psychologists that can determine what one person may or may not do. However these algorithms are confirmed to be dangerous left unchecked on a mass level - e.g. the genocide in Burma that occurred with the help of FB.

Ultimately, if I had a relative in those shoes, I'd want to give them the opportunity to be the best person they can be, not a hateful, screen-addicted crazy person. These things literally change their personality in the worst cases. Not being proactive is cowardly/negligent on the part of the person with the power to do it, imo.

you can try selecting "don't recommend this channel" on some of the more radical ones!

That means exactly the opposite for youtube. It will double-down on its "algo" smarts.

"Report for misinformation -> don't recommend this channel" is my usual strategy, but they just keep going. I don't do that with the catholic ones, because they are not that bad, but I'm sure they are a window for the bad ones to enter.

I just want to share my sympathies on how hard it must be when she goes and listens to those assholes on YouTube and believes them but won't accept her family's help telling her what bullshit all that is.

I hope you get her out of that zone op.

Greasemonkey with scripts that block key words or channels. This is the only real option next to taking away YouTube from the old bat.

You can't "de-radicalize" your mom because your mom has never been "radicalized" in the first place - if she was, she'd be spray-painting ACAB on the walls and quoting Bakunin or Marx at everyone.

Your mom has been turned into a reactionary - pretty much the opposite of that which is radical. And since you have access to your mom's youtube account, it's radical content that is required to start levering open the cognitive dissonance that right-wing content attempts to paper over.

However, it can't just be any radical content - it has to be radical content that specifically addresses things that are relevant to your mom. I'll use my own mom as an example... she has always been angry about the fact that my (late) father essentially tossed away the family's nest-egg in his obsessive attempts to become a successful business owner - I showed her some videos explaining how this neolib cult of business-ownership was popularized by the likes of Reagan and Thatcher (which she had no difficulty believing because my dad was a rabid right-winger who constantly brought these two fuckheads up in conversation during the 80s) which led to a lot of good conversations.

Obviously, your mom will not be as receptive to new information as mine is - so you may have to be a bit sneakier about it. But I don't see too many other options for you at this point.

Radicalization !== radical Marxist. Don't forget radicalism the ideology! But yes, definitions can change: etymologically and in modern language, political radicalism is pushing for change at the root of something. I think it is pretty fair to say OP's mom could fit this. Radicalism does not have to be, and often is not, leftist.

Furthermore, in my opinion, reactionary is not a descriptive word - it is wholly used as an insult. It doesn't describe an ideology; it is just a pejorative used to insult ideological opponents. You can tell by the fact that nobody unironically considers themselves a reactionary.

radicalism politically is to be pushing for a change at the root of something

You mean that thing right-wing ideology exists to prevent? There is no such thing as a "right-wing radical" - right-wing ideology exists to protect the status quo and destroy or co-opt that which advocates "change at the root..." it has no reason to exist otherwise.

I think it is pretty fair to say OPs mom could fit this.

The paranoid racism displayed by OP's mom isn't "radical" - it's bog-standard right-wing white supremacist colonialist paranoia.

You can again tell by the fact that nobody unironically considers themselves a reactionary.

They also generally do not self-identify as fascists or white supremacists, and you can easily anger most right-wingers even by just calling them right-wingers - so your point is... what?

You mean that thing right-wing ideology exists to prevent?

Right wing !== The status quo

There is no such thing as a "right-wing radical"

You did not click the link. You did forget Radicalism.

right-wing ideology exists to protect the status quo and destroy or co-opt that which advocates "change at the root..."

So in a socialist country, is advocating for liberalism or monarchism or whatever left-wing, and advocating for socialism right-wing? Were the Nazis left-wing in the Weimar Republic?

The paranoid racism displayed by OP's mom isn't "radical"

Iirc there was no mention of racism, and I assume based on the mention of "videos in Spanish" that OPs mom is more than likely Latina. But yeah, QAnon is pretty radical seeing as they definitely aren't the status quo.

They also generally do not self-identify as fascists or white supremacists

There are plenty who do, but I've never heard of someone identifying as reactionary.

you can easily anger most right-wingers even by just calling them right-wingers

I've never seen a right-winger being upset by that. Maybe because your definition of right-wing differs a lot from what most people would consider right-wing, they feel they are being mislabeled or mischaracterized.

Right wing !== The status quo

Riiight... I forgot the status quo requires centrists - not right-wingers - to do all the dirty work of violently maintaining the status quo. Because... reasons.

Were the Nazis left-wing in the Weimar Republic?

Did you see the Nazis promising capitalists that they would dismantle capitalism? No? How many of those Weimar capitalists got rich and fat off all the slave labor the Nazis provided them? You seem to have a weird idea what the term status quo means.

But yeah, QAnon is pretty radical seeing as they definitely aren’t the status quo.

There is nothing "radical" about QAnon - again, there is no such thing as a "radical right-winger."

I’ve never seen a right-winger being upset by that.

Lol! Why do you think they refer to themselves as "conservatives" and not right-wingers?

The videos you watch on YouTube influence the ones you're recommended. I once put on a couple of 8-hour cat videos to entertain a feline friend while I was away, and for a while YouTube kept recommending them to me. I had convinced it that I was a cat.

Get her to watch other videos (or even watch them on her behalf using her account), and also mark the awful ones as Not Interested > I Don't Like This Video using the thumbnail menu. It'll take some concerted effort though.

Also, even if you manage to get the recommendations off her front page, if one shows up and she clicks on it, it'll start recommending them again. YouTube's recommendation algorithm is really crappy and assumes you're all about whatever you've watched recently.

If she has no account, maybe try to disable YouTube from setting cookies, so she'll start over every time.

Hey, can you post an update at some point? I'm interested to see if you manage to reduce the radicalising content.

i'm really sorry to hear that you're going through a difficult time rn. it's a very frustrating situation to be in. i'm sending you a hug and good wishes.

I used to watch a lot of controversial/conspiracy/right-wing stuff because I wanted to keep tabs on their crazy. Eventually it all became way too toxic and bad for my mental health but I needed to fix my Youtube feed. There was lots of unsubscribing and channel blocking, but mostly watching of lots of trivial videos was what helped to clean it up.

I don't think downvoting helps, but upvoting does. The algorithm knows people will engage more with content that makes them angry/afraid ... don't let it know when you feel that way.

Maybe scroll through a lot of shorts and then like and subscribe to videos & channels that are innocuous or completely trivial (like cooking tips, makeup & fashion for older women, celebrity gossip entertainment, comedy, feel-good compilations etc). If you haven't already, definitely subscribe to Daily Dose of Internet. YouTube also seems to like pushing Lo-fi, meditation, and binaural beats (especially at night) ... sub a few of those.

Block those channels that have Rogan or Tate or similar in their video titles. Unsub from the likes of Dr John Campbell & Russell Brand. If she likes news and current affairs, try to find world news channels that are more highly regarded but less likely to be dismissed as the "ultra left" channels she would recognise from her conspiracy browsing. Channels like Reuters, AFP, or maybe news stations from other countries (like ABC News in Australia, or CBC and Global News in Canada). Block "Rebel News" and unsub from "News Nation", or anyone else that pushes clickbait headlines. Swap her subscription from "Sky News Australia" (block) to simply "Sky News" - the Australian one is made by NewsCorp (Murdoch) and targets the American right-wing audience, while the other is a legitimate UK news channel which is not owned by Murdoch - she won't notice the subtle difference in the logos, but the content will be dramatically different.

Also, I would recommend a channel called The Why Files ... he does videos on conspiracies, lays out all the conspiracy evidence, and then also lays out any debunking facts. He treats the subjects seriously, so watchers don't feel stupid for believing something, but then does a great job of presenting the debunking facts as well, so viewers get a really balanced presentation about controversial topics.

I hope this is helpful. Good luck.

P.S. If you haven't done so yet, I would recommend watching a documentary called "The Social Dilemma" if you are able to find a copy (it's on Netflix).

It may help to not log on to YouTube at all. If she uses Google accounts, set YouTube to always open in a different container where you're not logged in. The recommendations are set by a single id in the cookies which can easily be reset.

Kittens are good. Maybe some BBC stuff? Get her into the nature shows. Damn near anything on BBC Earth should be great for that.

DIY building stuff: Simone Giertz is awesome, as is Laura Kampf.

For some super chill diy, check out SteadyCraftin

ClickSpring builds super cool old shit and is just beautiful to watch.

Either The Action Lab or Steve Mould should work for understanding science.

For fun experiments and sciencey things, Mark Rober is entertaining.

Want something food related? I’ve heard good things about Tasting History (Max Miller), haven’t checked him out myself, though. Or James Hoffman does a great show all about coffee.

Does she knit or crochet? There are hundreds of channels about yarny things - Tiny Fiber Studios does stuff like that.

Or maybe she likes other things. Sashiko Story teaches the art of sashiko (Japanese embroidery). The Violet Unicorn teaches weaving.

How about music? Tim Reynolds is incredible, as is Marcin.

Maybe some BBC stuff?

Gardener's World, the way the show is presented is soothing yet fascinating at the same time. There are a lot of full episodes on YouTube, and it is healthy content for a 70-year old mind... or much younger, even.

Just stay away from BBC News' politics coverage of the UK; it's dogshit, almost as bad as the tabloids. World coverage and other fun stuff is fine.

Make a new account for YouTube, and take the one she was using out of the app (if it's just for logging into the phone, then take it out of the phone altogether). But like someone said earlier, she's just gonna feel like it's getting censored, or get bored and look for that stuff herself. So you'd better explain to her in a long talk how the internet media machine works and how to distinguish fact from fiction. Which is gonna be a daunting task, but sometimes it works. If she just got into watching YouTube videos, then it should be a little easier to talk to her about it.

Replace her existing account with a new one. Make up some excuse. Use the don't recommend channel option when a bad recommendation comes up.

If you're keeping the channel, the "Don't recommend this" method while also deleting watch history with controversial content would help. but honestly, closing the account and opening a new one with a fresh recommendation slate is the way.

Invidious or Freetube if she's on a personal computer.

You can go into the view history and remove all the bad videos.

My mom has a similar problem with animal videos. She likes watching farm videos where sometimes there are animals giving birth... and those videos completely ruin the algorithm. After that she only gets animals having sex, and animals fighting and killing each other.

It's not going to be easy. Most likely she'll be exposed to this shit elsewhere too, not only on YouTube. Fully reset the browser in ANY case: delete history, cookies, everything.

Besides that, I see multiple options. You can consider them all, or in various combinations.

Any change should be done gradually, I think. If the elder person notices that what she's watching is suddenly not the shows she subscribed to, then she will clam up and refuse to watch the new shows until OP brings back her "account".

I would be really pissed if my kids deleted my stuff.

I’m not questioning what’s right and wrong or if this even changes someone’s opinion.

It’s merely a technical approach that can somewhat work until it’s filled up with the same crap again in a few weeks.

The person got there once - it’ll likely happen again. But combining this with conversation and actual facts, can be a start.

That's exactly the point. If anyone decides to cut me off from my shit, I would cut them off.

You can't alter their behaviour with that kind of approach unless they already have severe dementia. Imo, the only thing that helps is talking to her and getting her into other activities.

Could also help to deactivate the personalized advertising functionality in the Google/YouTube settings (basically wipe the currently stored preferences, then forbid YouTube from making suggestions based on your interests). This will keep her feed fairly generic (and bad, oh boy), so she would have to actively search for or subscribe to these videos.

Redirect YouTube to an Invidious instance by editing the hosts file.
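For reference, a minimal hosts-file sketch of that idea. The IP 203.0.113.10 is a documentation placeholder standing in for your chosen Invidious instance's address, and note the caveat: a plain hosts redirect of youtube.com will trigger HTTPS certificate warnings, since the Invidious server can't present a valid certificate for youtube.com, so a redirect browser extension is usually more practical.

```
# /etc/hosts (on Windows: C:\Windows\System32\drivers\etc\hosts)
203.0.113.10  youtube.com
203.0.113.10  www.youtube.com
203.0.113.10  m.youtube.com
```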

Get your mom on Bondi Rescue. She will never watch anything again in her life.

It has to be content in Spanish, which makes it a little difficult, because I watch exclusively content in English, so I don't know what to recommend her.

You should consider asking for recommendations somewhere there's more Spanish speaking people, because giving the algorithm something new to recommend is necessary to keep the shit away

To begin with, I'd stop her using the YouTube website logged in.

Set her up with Inoreader - then when you find something interesting, you can visit the 'video' page and subscribe to the feed...

If you log her out, that will mean she just uses it without logging in - which works ok for feeds. She can browse, but it's fresh.

Just a few suggestions that work for me - via RSS feed (not subscribed in Youtube):

  • Juicy Life
  • Sci Show
  • Simonscat
  • FailArmy
  • Daily Dose
  • Vlad Vexler
  • Big Think

With inoreader it's a simple task to mark all items in a folder as read if you want to skip them. Also, with the inoreader account you can log in remotely at any browser, then you can find stuff you think is cool and subscribe - then that will show up in her feeds.
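For anyone who'd rather skip Inoreader: YouTube publishes a public Atom feed per channel that any RSS reader can poll, with no recommendations attached. A minimal Python sketch (the channel ID and the inline sample feed are placeholders for illustration, not a live channel):

```python
# Sketch: subscribe to a YouTube channel's Atom feed directly,
# bypassing the recommendation algorithm entirely.
import xml.etree.ElementTree as ET

def feed_url(channel_id: str) -> str:
    """Build the public Atom feed URL for a YouTube channel."""
    return f"https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"

def video_titles(atom_xml: str) -> list:
    """Extract video titles from an Atom feed document."""
    ns = {"a": "http://www.w3.org/2005/Atom"}
    root = ET.fromstring(atom_xml)
    return [e.findtext("a:title", namespaces=ns) for e in root.findall("a:entry", ns)]

# Tiny inline sample instead of a live network call:
sample = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Channel</title>
  <entry><title>First video</title></entry>
  <entry><title>Second video</title></entry>
</feed>"""

print(feed_url("UC_placeholder"))
print(video_titles(sample))  # ['First video', 'Second video']
```

The live feed only carries a channel's most recent uploads, so it's recommendation-free by construction.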

Not only video/youtube.

I love this site, I found a couple of dozen and added them in pyradio app so I've always got interesting listening when I don't want to stare at the screen.

Damn, that drive and listen and window swap are pretty sweet. I will probably show these to my dad.

I had to log into my 84-year-old grandmother's YouTube account and unsubscribe from a bunch of stuff, "Not interested" on a bunch of stuff, subscribed to more mainstream news sources... But it only works for a couple months.

The problem is the algorithm that values viewing time over anything else.

Watch a news clip from a real news source and then it recommends Fox News. Watch Fox News and then it recommends PragerU. Watch PragerU and then it recommends The Daily Wire. Watch that and then it recommends Steven Crowder. A couple years ago it would go even stupider than Crowder, she'd start getting those videos where it's computer voice talking over stock footage about Hillary Clinton being arrested for being a demonic pedophile. Luckily most of those channels are banned at this point or at least the algorithm doesn't recommend them.

I've thought about putting her into restricted mode, but I think that would be too obvious that I'm manipulating the strings in the background.

Then I thought: she's 84, she's going to be dead in a few years, she doesn't vote, does it really matter that she's concerned about trans people trying to cut off little boys' penises, or thinks that Obama is wearing an ankle monitor because he was arrested by the Trump administration, or that aliens are visiting the Earth because she heard it on Joe Rogan?

I did the same thing, although mixing in some videos about other things will declutter the algorithm too.

Idk about here, but if you need help, on Reddit there is a sub called QAnonCasualties, and it's basically a support group for family members of Q whoevers.

Besides deleting it manually, you can enable the option to automatically delete it every 3 months.

Google account -> Data & privacy -> History Settings -> YouTube History -> auto-delete

But the best option is still a new account.

No joke, the answer is for her to stop watching YouTube. I drastically reduced my YouTube consumption from like multiple hours a day to max 30 minutes. The algorithm takes into account how many hours you've watched per day.

Find several wholesome channels that have a decent amount of content and look at the playlists. Make sure autoplay is on and just plow through playlist after playlist every night.

I did this by mistake once with kid videos. It wrecked my feed for so long 🤣

In addition to clearing the watch history, I'd suggest marking channels as "Don't recommend channel" in the Home view, refreshing, and doing the same for the next batch. This will stop those channels popping up at all. Then use the account to watch a few more normal ones. Of course the algorithm will pull to the right over time, but removing channels from the recommendation rotation should slow things down.

I too faced this dilemma. So I uninstalled every ad blocker and made watching videos very tedious. It kinda helped.

What if you started disliking these extremist videos, and explicitly liking more tame videos with the kind of content she would enjoy? This might tell the algorithm to prefer less polarizing videos over time.

The caveat is, I've heard likes/dislikes "don't count" unless you've watched a certain percentage of the video, at least more than a few seconds. So you can't just sit there and speed through clicking dislike on everything.

Another thing you can try is clicking algorithm-suggested videos that seem less radical than the current one. For example, on the home screen, click on something innocuous that's trending. From "related videos," in any video, pick literally anything that's not what she's already watching.

You will probably have to do this for a while, but eventually she might "latch on" to whatever the algorithm starts suggesting. She almost certainly won't stop watching those radical videos entirely - people that age are set in their ways. But maybe you can get her to watch less.

I thought the algorithm responds to downvotes with more of the same content because negativity drives engagement. I very rarely downvote because I don't want YT to know what I don't like. I do however tell it to completely block rage-bait channels. This works for my home feed but not in trending.

Hmmm, good point. To be honest, I'm not sure if anyone knows how the algorithm works, I'm only guessing.

Maybe content creators would know for sure, but I'm not one :p

If you haven't done so yet, I would recommend watching a documentary called "The Social Dilemma" if you are able to find a copy (it's on Netflix).

try is clicking algorithm-suggested videos that seem less radical than the current one

Never thought of this one, but it makes a lot of sense. Thank you!

In addition to deleting history, could you get direct access to the account and watch better/different content to hopefully get that stuff more recommended too?

I don't really know what to look up that could interest her without it being a window for more of that kind of videos. She likes history, but all the Spanish history channels are whitewashed-colonialism shit that reinforces the bad algorithm. I'm starting to watch and like videos from channels like "old things never die" or from guys who clean carpets and so on; hopefully that moves the algorithm toward more neutral, international, documentary-style videos.

This may sound outlandish, BUT HEAR ME OUT: I can use the brand channel myself on your behalf. I speak Spanish, so I can discern very neutral anti-propaganda history channels, as well as some crafts and cleaning videos in Spanish, all while blocking all the channels you tell me to. For my kind of work I have a shit ton of free time on my hands; I tend to spend 6-ish hours watching YouTube, or sometimes 12-ish if I also leave stuff playing in the background. I will deliver a fully fledged clean algorithm with tons of fresh content for her.

Let me be your YouTube algo dealer. 🤝

The only thing I can really suggest is to start culling some of the worst, and introduce some counter-influences like debunkers, in hopes of her feed organically at least introducing some of that to her over time.

I am not sure if she would watch, but there is at least the possibility. Even if you kill her account in some fashion, if she just rejoins and starts looking for the same stuff again...

Dislike and "do not suggest" as many of those videos as you find in her recommendations. Block a few channels. Like some random non-aggressive, non-political channels and videos. See if any bad channels are subscribed, and unsub + block them.

Do this for a few days and her algos will hopefully be clean.

Create a new YouTube profile with the same name and picture and set it as the default watch profile. Then just preload it with subscriptions from a varied amount of content and a few stray likes here and there for content she might like in general. Don’t mention what happened and just remove the old profile after a while.

Block YouTube, Facebook, and any other social media through DNS with Pi-hole for the time being. In the meantime, try to reset her YouTube account (or delete it and create a new one, if she doesn't use Gmail).
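If you go the Pi-hole route, here's a hedged sketch of generating a hosts-format blocklist you can import as a local list (on Pi-hole v5 the same result comes from running `pihole -b youtube.com` per domain; the filename and domain set below are just examples, and YouTube in particular serves content from many more hostnames than these):

```python
# Sketch: build a hosts-format blocklist that Pi-hole (or any DNS
# sinkhole) can import. The domain list is illustrative, not exhaustive.
domains = [
    "youtube.com", "www.youtube.com", "m.youtube.com",
    "facebook.com", "www.facebook.com",
]

def to_hosts_format(domains):
    # 0.0.0.0 is the conventional blackhole address in hosts-style lists
    return "\n".join(f"0.0.0.0 {d}" for d in domains) + "\n"

with open("social-blocklist.txt", "w") as f:
    f.write(to_hosts_format(domains))
```

Keep in mind DNS blocking is all-or-nothing: it also kills embedded YouTube players on other sites, which may be the tell that gives the game away.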

You can't, it's like trying to fix your mom's phone. The only way is resetting it, try resetting your mom.

i dont know but im just glad youre alive bc the fish bumped you out of the water with its nose

If it's on the phone, clear the cache and log out of the phone. It will reset the recommendations to the shitty defaults. If it's on PC, clear cookies and cache. How you choose to handle the account itself is entirely up to you.

Your mum is 70. As far as you're concerned, you're her Guardian now. Block that shit and put old NatGeo specials on repeat. "Old" because the current NatGeo is shitty Yank Trucker Alien Tuna Trash. You must treat her like a child who needs guidance and fresh air. Limit the tv and internet to 3 hours a day. Make her spend daytime hours socializing irl with pensioners club etc.

70 isn't THAT old.

Anyone who reaches the average age of a U.S. Senator is basically senile.

My dad is 70 in a few years and still has Boeing, Northrop Grumman, and various other aerospace companies asking him to consult for them in his retirement.

I don’t totally disagree with you, but let’s not generalize.

What? That's pretty insulting, really - your statement could be applied to any age. Generalisation isn't cool.

How many 70 year olds do you know? Because the ones I know are either still in work or fully active. Retirement age here in the UK is 67, are you saying those in full time work should be treated like children? My mum and dad are in that age group and Dad's set up his own Plex server for christ's sake.

The people I know who have the worst issues with social media and shit like that are half the age, and the less said about my daughter's age group and their tiktok addiction the better.

Migrate her over to a Federated alternative. No aggressive algorithms here

accept her right to freedom of information

If she wasn't a cognitively declining 70-something-year-old, I would agree with you. When people get older, their minds revert to a more childlike understanding of the world. Would you let a child look at whatever they want on the internet?