YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead

Anonymau5@lemmy.world to Technology@lemmy.world – 1114 points –
fortune.com

cross-posted from: https://lemmy.world/post/3320637

The complementary lawsuits claim that the 2022 massacre was made possible by tech giants, a local gun shop, and the gunman's parents.



I sub to primarily leftist content, and the YouTube Shorts algorithm insists on recommending the most vile far-right content on the planet. It's gotten to the point that I'm convinced YouTube is intentionally trying to shift people far right.

I primarily watch woodworking or baking content on YouTube, and I feel like the far-right content is especially prevalent in Shorts. I'll watch something like a quick tool review, and the next video will be someone asking folks on the street if it's OK to be white. What color you are isn't your decision, but what you do every day is, and being some dumbass white kid accosting Black tourists in Times Square for shitty reaction content is just gross.

It doesn't matter how often I say I dislike the content, block channels, or whatever; YouTube has just decided it's going to check in from time to time and see if I want to let loose my inner Boomer and rage with Rogan.


It could be that pushing videos from the other side of the political spectrum gets interactions in the form of people sharing or commenting on them. Even if you disagree, going "Why does YouTube recommend this? This is awful" is still a share.

The algorithm prioritises interactions above all else, and few things get people interacting more than content they think is wrong or vehemently disagree with.
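The dynamic described above can be sketched as a toy engagement-weighted ranker. This is purely illustrative: YouTube's actual ranking system is not public, and the weights, fields, and the premise that negative engagement counts the same as positive engagement are all assumptions here.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int
    comments: int
    shares: int

def engagement_score(v: Video) -> float:
    # Hypothetical weights: comments and shares count far more than
    # passive views, and the score is blind to *why* people engaged --
    # an angry comment counts the same as an approving one.
    return v.views * 0.1 + v.comments * 5.0 + v.shares * 10.0

feed = [
    Video("calm tool review", views=10_000, comments=40, shares=20),
    Video("outrage bait", views=8_000, comments=900, shares=300),
]
ranked = sorted(feed, key=engagement_score, reverse=True)
print([v.title for v in ranked])  # "outrage bait" ranks first despite fewer views
```

Under a scoring rule like this, a video that provokes a flood of angry comments and hate-shares can outrank one that more people quietly enjoyed, which is consistent with the rage-bait pattern the comments above describe.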


This is happening on my FB video feed. I watch a funny chick called Charlotte Dobre who does funny reaction videos. I honestly love her, but all my algorithm shows me for recommendations are cop-brutality videos with comments praising the cops, and right-wing crap praising Abbott's wall and DeSantis's dictatorship. It drives me nuts, and no matter how many pages I block, I always get more right-wing recommended crap videos.

Wow, I am so surprised by this. I watch mainly tech and gardening YouTube, and my Shorts have been extremely relevant to me.

Even when I use a new computer, like at work, the Shorts are mostly pop culture.

That doesn't make Shorts any less annoying, though.

I literally only get Marvel Snap and general gaming, College Humor, tech, educational content, stand-up comedy, and drones. That's it. I don't mean to victim-blame, but it learns what you click on and what you stay to watch.

Mine acted similarly to yours. I recently started watching a few more short videos, and now it's showing me an unfortunate amount of that far-right nonsense.

I am pretty sure it is just showing politically charged content to people who watch other politically charged content. I feel the blame is misdirected at something that only serves content people will probably like, based on their viewing history.
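The "people who watched X also watched Y" effect this comment describes can be sketched with a minimal item-item co-occurrence recommender. The watch histories and video names are hypothetical, and real recommenders are vastly more complex; the point is only that pure correlation in viewing histories can pull a viewer from neutral content toward political content without anyone encoding politics into the system.

```python
from collections import Counter
from itertools import combinations

# Hypothetical watch histories; note how "street interview" clips
# co-occur with overtly political videos in other users' histories.
watch_histories = [
    ["woodworking", "tool review", "street interview"],
    ["tool review", "street interview", "political rant"],
    ["street interview", "political rant"],
    ["street interview", "political rant"],
]

# Count how often each pair of videos appears in the same history.
co_watch = Counter()
for history in watch_histories:
    for a, b in combinations(sorted(set(history)), 2):
        co_watch[frozenset((a, b))] += 1

def recommend(seed: str) -> str:
    # Return the video most often co-watched with the seed -- the model
    # has no notion of topic or politics, only viewing correlation.
    scores = {pair - {seed}: n for pair, n in co_watch.items() if seed in pair}
    best = max(scores, key=scores.get)
    return next(iter(best))

print(recommend("street interview"))  # "political rant"
```

Here a viewer who only ever clicked a street-interview clip gets a political rant recommended, simply because other viewers paired the two, which matches the drift described earlier in the thread.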

It depends entirely on what you're subscribed to; if you have multiple linked YouTube accounts (such as on a premium family plan), it depends on what THEY'RE subscribed to; and it depends on location (my recommendations at home and my recommendations at work are wildly different).
