this post was submitted on 23 Jan 2025
1118 points (97.3% liked)

Technology

TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this five times, each time changing their location to a random city in the US.
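
Just to make the setup concrete, here is a rough sketch of that counting procedure in Python. The `next_recommended_short()` feed and the `looks_alt_right()` judgment are hypothetical stand-ins; in the video all of this is done manually by watching the feed on a fresh account.

```python
# Illustrative sketch only; Benaminute does this by hand in the video.
# next_recommended_short() and looks_alt_right() are hypothetical stand-ins
# for "watch the next short the feed serves" and "a human judging the content".

def shorts_until_alt_right(next_recommended_short, looks_alt_right, limit=250):
    """Watch recommended shorts in order, without skipping, and return how
    many were shown up to and including the first alt-right one, or None
    if the limit is reached first."""
    for count in range(1, limit + 1):
        short = next_recommended_short()  # fresh account, feed order only
        if looks_alt_right(short):
            return count
    return None

# One fresh account per city (location set via VPN), e.g.:
# results = {city: shorts_until_alt_right(feed_for(city), human_judgment)
#            for city in ["Houston", "Chicago", "Atlanta", "NYC", "San Francisco"]}
```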

Below is the number of shorts after which alt-right content was recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. Then AI Jesus shorts started to appear (either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After that, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same thing in another, similar experiment (which dealt with long-form content instead of shorts). After some shorts, one appeared in which an AI Gru (the main character from Despicable Me) told you to vote for Trump. He went on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and therefore rank higher in the algorithm. They say the algorithm isn't necessarily left-wing or right-wing, but that alt-right creators have better understood how to capture and grow their audience.
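
Purely to illustrate that hypothesis (this is not YouTube's actual ranking, which isn't public): if a recommender scores candidates by predicted engagement alone, the content that provokes the strongest reactions wins, even though nothing in the formula mentions politics. A minimal sketch with made-up fields and weights:

```python
# Toy engagement-weighted ranker: neutral on its face, but whatever
# maximizes predicted watch time and reactions floats to the top.
# All fields and weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class Short:
    title: str
    expected_watch_fraction: float  # predicted share of the clip watched
    expected_reactions: float       # predicted likes/comments/shares per view

def engagement_score(s: Short) -> float:
    return 0.7 * s.expected_watch_fraction + 0.3 * s.expected_reactions

candidates = [
    Short("calm explainer", 0.45, 0.02),
    Short("outrage bait", 0.80, 0.15),  # strong emotion -> more watching and reacting
]

for s in sorted(candidates, key=engagement_score, reverse=True):
    print(f"{s.title}: {engagement_score(s):.2f}")
# The outrage-bait short ranks first even though the scorer knows
# nothing about its politics.
```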

[–] [email protected] 12 points 2 days ago (2 children)

Same happened to me (I live in WA), but not only do I get pro-tyranny ads and Broprah (Rogan) shorts, I also get antivax propaganda.

I always use the “show less of this” option or outright remove it from my feed. Seems better now.

[–] [email protected] 29 points 2 days ago

I don't think it makes me feel better to know that our descent into fascism is because Gru promised 1MM rizz for it.

[–] [email protected] 2 points 2 days ago

I use YouTube and don't get much far-right content. My guess is it's because I don't watch much political content. I use a podcatcher and websites for that. If I watched political content, it might show me some lurid videos promoting politics I disagree with because that tends to keep viewers engaged with the site/app longer than if they just showed videos consistent with the ideology I seek out. That gives people the feeling they're trying to push an ideology.

I made that up without any evidence. It's just my guess. I'm a moderate libertarian who leans Democratic because Republicans have not even been pretending to care about liberty, and for whatever reason it doesn't recommend the far-right crap to me.

[–] [email protected] 1 points 2 days ago

I've been happy with BlockTube for blocking channels or single videos. I also use YouTube Shorts Redirect for automatically converting shorts into regular videos.

[–] [email protected] 2 points 2 days ago

If the channel is popular, those videos will get recommended.

If it has engagement on top of that, you are fucked; it will definitely get recommended to you.

Either block the channel, block the user, or use incognito. Or don't.

[–] [email protected] 1 points 2 days ago* (last edited 2 days ago)

Almost no corporation benefits from operating in a liberal/left country; they are harder to exploit and profit from. Why would they promote things like worker protection, parental leave, unions, reducing their own rights in favor of society, paying for healthcare, etc.? Edit: Wording

[–] [email protected] 42 points 2 days ago (2 children)

Alt-right videos are made to elicit outrage, hate, and shock, which our lizard brains react to more strongly (due to potential danger) than positive videos spreading unity and love. It's all about getting as many eyeballs on the video as possible to make money, and this is the most effective way.

[–] [email protected] 6 points 2 days ago (3 children)

So all this stuff about climate change being an existential threat is actually alt right?

[–] [email protected] 15 points 2 days ago

Are people making clickbait/ragebait articles about climate change? Are people seeking out clickbait about climate change?

I don't need to be constantly reminded of climate change, but an old "friend" is constantly telling me about the politics of video games he doesn't even have a system to play with.

[–] [email protected] 12 points 2 days ago* (last edited 2 days ago) (1 children)

All alt-right content is made to generate outrage, but content that generates outrage is not necessarily alt-right.

[–] [email protected] 7 points 2 days ago (1 children)

Another important part of alt right bullshit is that they blame people that viewers can easily identify on the streets. Crime? It's the immigrants and blacks! Shit economy? Jews and the deep state!

So, I guess the only way to fight climate change is by accusing every petrol CEO of being a deep state Jew gay communist

[–] [email protected] 2 points 2 days ago (1 children)

I don't think you meant it that way, but how are Jews 'easily identifiable' on the street?

[–] [email protected] 3 points 2 days ago

Ever seen that caricature of a Jew? The one with a huge nose, a grin, and curly hair? That's how the idiots picture all Jews. It doesn't matter that it's a racist/xenophobic stereotype; it gives them a "clear, recognizable face" for the enemy. It creates an image of "the enemy" in their minds.

[–] [email protected] 11 points 2 days ago* (last edited 2 days ago)

The people where I live are -- I guess -- complete morons because whenever I try to check out Youtube without being logged in, I get the dumbest of dumb content.

But as another weird data point, I once suggested my son check out a Contrapoints video which I found interesting and about 1 year later she told me she wanted to get a surgery -- I don't exactly remember which kind as I obviously turned immediately into a catatonic far right zombie.

[–] [email protected] 7 points 2 days ago* (last edited 2 days ago) (1 children)

Real talk: I've been using YouTube without an account and with some ad blocking stuff installed. Based on what I'm seeing, I'm pretty sure the algorithm's datapoint for me is "He was born with a penis and is ok with that."

When I lose my better judgement and start scrolling shorts like an idiot, it's fight videos (IRL, movie scenes, UFC and boxing), auditing, Charlie Kirk and right-wing influencers, and the occasional clip from Shoresy on the basis of "he might be Canadian too, idk".

It is noticeably weird, and I have brought it up with my kid, who uses an account, is not what YouTube believes me to be, and whose shorts feed is very different.

We do both get that guy who opens Pokemon cards with a catchy jingle, though.

[–] [email protected] 3 points 2 days ago

I check friends' Snapchat stories from time to time, and Snapchat suggests public stories on the same page. I think Snapchat has the same sort of singular data point on me, "this account is likely a straight man", because most of what they show me are sports clips, women influencers in revealing clothing, and right-wing influencers talking about culture war stuff. I never view any of that sort of stuff, but it still shows up any time I try to check my friends' stories. I guess I view public stories so infrequently that they just give me a default generic man feed.

[–] [email protected] 4 points 2 days ago (1 children)

fresh YouTube account

change their location to a random city in the US

Yeah, but you're still bound to IP addresses. I was under the impression YouTube used those for their profiling.

[–] [email protected] 9 points 2 days ago (2 children)

Probably used a VPN I'd imagine, but I haven't watched the video

[–] [email protected] 2 points 2 days ago

Most likely, yes.

My point is, if YouTube customizes feeds based on IP addresses, then the accounts used are not really "fresh"; there's already some data attached to the profiles upon their creation.

[–] [email protected] 3 points 2 days ago (1 children)

You are correct. He used a VPN for several US locations in the video. He then compared what content was shown in different regions of the US to see if everyone sees the same thing or if the content is radically different depending on where you are.

[–] [email protected] 2 points 2 days ago

Even then, do they really think YouTube has not flagged VPN connections and taken them into account in its suggestions?

If a lot of people are using that VPN's IP range, it's basically the same as doing nothing.
