
TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches every recommended short without skipping. They repeat this five times, each time changing their location to a random city in the US.

Below is the number of shorts after which alt-right content was first recommended; left-wing/liberal content was never recommended first. (A sketch of the counting procedure follows the list.)

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)
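
Here's a minimal sketch of that counting procedure in Python. `feed` and `is_alt_right` are hypothetical stand-ins for the watching and classifying Benaminute did by hand, not anything YouTube exposes.

```python
# Hypothetical sketch of the experiment's counting loop. `feed` and
# `is_alt_right` are stand-ins for manual watching and classification.
from typing import Callable, Iterable, Optional

def shorts_until_alt_right(feed: Iterable[str],
                           is_alt_right: Callable[[str], bool],
                           limit: int = 250) -> Optional[int]:
    """Watch recommended shorts in order without skipping; return the
    1-based index of the first alt-right one, or None if `limit` is hit."""
    for i, short in enumerate(feed, start=1):
        if is_alt_right(short):
            return i  # e.g. Houston: 88, NYC: 247
        if i >= limit:
            return None  # e.g. San Francisco: never within 250
    return None
```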

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to appear (with either AI Jesus talking to you, or an AI narrator reading verses from the Bible). Next, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally came explicitly alt-right shorts.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same in another, similar experiment (which dealt with long-form content instead of shorts). After some shorts, one appeared in which an AI Gru (the main character from Despicable Me) told you to vote for Trump. He went on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and therefore rank higher in the algorithm. They argue the algorithm isn't inherently left-wing or right-wing; rather, alt-right creators have better figured out how to use it to capture and grow an audience.
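
That hypothesis maps onto a simple model: if a recommender scores candidates by predicted engagement, any class of content that reliably provokes stronger reactions floats upward, no politics required. Here's a toy scorer; the weights and fields are invented for illustration and are not YouTube's actual system.

```python
# Toy engagement-based ranker: the score never looks at ideology, only at
# reaction strength. All weights and numbers are invented.
from dataclasses import dataclass

@dataclass
class Short:
    title: str
    watch_fraction: float      # share of the clip viewers watch, 0..1
    reactions_per_view: float  # likes + comments + shares per view

def engagement_score(s: Short) -> float:
    return 0.7 * s.watch_fraction + 0.3 * min(s.reactions_per_view * 10, 1.0)

shorts = [
    Short("calm gardening tips", 0.45, 0.01),
    Short("OUTRAGEOUS take DESTROYS opponent", 0.80, 0.07),
]
for s in sorted(shorts, key=engagement_score, reverse=True):
    print(f"{engagement_score(s):.2f}  {s.title}")
```

The rage-bait clip ranks first even though the scorer has no concept of left or right, which is exactly the shape of the claim.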

[–] Zoot@reddthat.com 2 points 1 day ago (1 children)

Just anecdotal but I only ever watch duck videos or funny animal videos with occasional other funnies or crazy science things, and that's still all I ever get. Other days I get plenty of cool music like tesla coils making music or other piano music.

Am I youtubing wrong?

[–] UraniumBlazer@lemm.ee 1 points 1 day ago

Nah good for you. Maybe it's because of your geographical location/you just being lucky? I have experienced what the video above says quite a lot though.

I'm not American, so I didn't exactly see a lot of Trump (although there was some amount of it). I largely saw a lot of Hindu nationalist content (cuz of my geographical location). The more I disliked the videos, the more they got recommended to me. It was absolutely pathetic.

[–] pennomi@lemmy.world 202 points 6 days ago (9 children)

I think the explanation might be even simpler - right wing content is the lowest common denominator, and mindlessly watching every recommended short drives you downward in quality.

[–] Plebcouncilman@sh.itjust.works 65 points 6 days ago* (last edited 6 days ago) (2 children)

I was gonna say this. There's very little liberal or left-leaning media being made, and what there is, is mostly made for a female or LGBTQ audience. Not saying that men can't watch those, but there's not a lot of "testosterone"-infused content with a liberal leaning (one of the reasons Trump won), so by sheer volume you're bound to see more right-leaning content. Especially if you are a cisgender male.

Been considering creating content myself to at least stem the tide a little.

[–] BassTurd@lemmy.world 40 points 6 days ago (3 children)

I think some of it is that liberal media is more artsy and creative, which is more difficult to just pump out. Creation is a lot more difficult than destruction.

[–] Plebcouncilman@sh.itjust.works 15 points 6 days ago

Not necessarily. A lot of "manosphere" guys have taken hold of philosophy, health, and fitness topics; a liberal influencer can give a liberal view on those same subjects. In philosophy, for example: explain how Nietzsche was not just saying that you can do whatever the fuck you want, or how Stoicism is actually a philosophy of tolerance, not of superiority, etc. There's really a lot of space that can be covered.

[–] credo@lemmy.world 15 points 6 days ago (4 children)

I refuse to watch those shit shorts; I think your theory has legs. Unfortunately there doesn’t seem to be a way to turn them off.

[–] gmtom@lemmy.world 43 points 5 days ago (2 children)

So you're saying we need to start pumping out low quality left wing brainrot?

[–] eronth@lemmy.world 20 points 4 days ago (1 children)

Insanely, that seems to be the play. Not logic or reason, but brainrot and low blows. Which is a bit at odds with the actual desire.

[–] xor@lemmy.dbzer0.com 8 points 4 days ago

fight fire with fire i guess….
maybe people get on board quicker if they feel the emotions first, and then learn the logic….
one good example is Noam Chomsky: everything he says is gold, but he says it so slowly and dispassionately that even people who agree with him find it hard to watch.

[–] sudo@programming.dev 7 points 4 days ago* (last edited 4 days ago)

It just has to be extremely simplified and evoke emotional reactions. Those are basic propaganda rules. The brainrot quality of the content is a consequence of its sheer quantity: you can't produce that volume of content without fully automated AI slop.

What the experiment overlooks is that there are PR companies being paid to flood YouTube with right-wing content, actively trying to game its algorithm. There simply isn't a left wing with the capital to manufacture that much content. No Soros bucks for AI minions in keffiyehs talking about Medicare.

[–] victorz@lemmy.world 73 points 6 days ago (2 children)

I keep getting recommendations for content like "this woke person got DESTROYED by logic" on YouTube. Even though I click "not interested", and even "don't recommend channel", I keep getting the same channel, AND video recommendation(s). It's pretty obvious bullshit.

[–] SaharaMaleikuhm@feddit.org 22 points 6 days ago (6 children)

Anything but the subscriptions page is absolute garbage on that site. Ideally get an app to track your subs without having to have an account. NewPipe, FreeTube etc.

[–] lennivelkant@discuss.tchncs.de 19 points 6 days ago (8 children)

You'd think a recommendation algorithm should take your preferences into account - that's the whole justification for tracking your usage in the first place: recommending relevant content for you...

[–] socialmedia@lemmy.world 49 points 6 days ago* (last edited 6 days ago) (3 children)

I realized a while back that social media is trying to radicalize everyone, and it might not even be entirely the fault of the oligarchs who control it.

The algorithm was written with one thing in mind: maximizing engagement time. The longer you stay on the page, the more ads you watch, the more money they make.

This is pervasive, and even if educated adults tune it out, there are always children, with Mr. Beast and thousands of others trying to trick them into liking, subscribing, and following.

This is something governments should be looking at how to control. Propaganda created for the sole purpose of making money is still propaganda. I think at this point, any site that feeds users algorithmically personalized content is compromised.

[–] whoisearth@lemmy.ca 13 points 6 days ago (1 children)

The problem is education. It's a fool's game to try to control human nature; the commodification of everything means you will always have commercials and propaganda.

What is within our means is to strengthen education in critical thinking and understanding your environment. This is where we have failed, and I'll argue there are people actively destroying this for their own gain.

Educated people are dangerous people.

It's not 1984. It's Brave New World. Aldous Huxley was right.

[–] Dark_Arc@social.packetloss.gg 11 points 6 days ago (6 children)

I think we need to do better than just say "get an education."

There are educated people that still vote for Trump. Making it sound like liberalism is some result of going to college is part of why so many colleges are under attack.

From their perspective I get it: many of the Trump voters didn't go, so they hear that and just assume brainwashing.

We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for information, etc., not just the kind of "education" where you regurgitate talking points from teachers, the TV, or the radio as if they're matters of fact ... and the whole education system is pretty tuned around regurgitation, even at the college level. The culture of exploration surrounding college (outside of the classroom) is likely more where the liberal viewpoints come from, and we'd be ill advised to assume the right can't destroy that.

[–] ayyy@sh.itjust.works 7 points 5 days ago

This discussion existed before computers. Before that it was TV and before that it was radio. The core problem is ads. They ruined the internet, TV, radio, the press. Probably stone tablets somehow. Fuck ads.

[–] HoMaster@lemm.ee 42 points 6 days ago (8 children)

Alt-right videos are made to elicit outrage, hate, and shock, which our lizard brains, wired to react to potential danger, respond to more strongly than positive videos spreading unity and love. It's all about getting as many eyeballs on the video as possible to make money, and this is the most effective way to do it.

[–] sudo@programming.dev 4 points 4 days ago

There's also an entire industry around mass producing this content and deliberately gaming the algorithm.

[–] HawlSera@lemm.ee 60 points 6 days ago (3 children)

I hate the double standards

On a true crime video: "This PDF-File game ended himself after he was caught SAing this individual.... Sorry Youtube forces me to talk like that or I might get demonetized" Flagged for discussing Suicide

On PragerU: "The Transgender Agenda is full of rapists and freaks who will sexually assault your children, they are pedophiles who must be dealt with via final solution!" Completely fucking acceptable!

[–] danciestlobster@lemm.ee 29 points 5 days ago

I don't think it makes me feel better to know that our descent into fascism is because gru promised 1MM rizz for it

[–] Blackmist@feddit.uk 36 points 6 days ago (1 children)

From my anecdotal experiences, it's "manly" videos that seem to lead directly to right wing nonsense.

Watch something about how a trebuchet is the superior siege machine, and the next video recommended is like "how DEI DESTROYED Dragon Age Veilguard!"

[–] Valmond@lemmy.world 22 points 6 days ago

Or "how to make ANY woman OBEY you!"

Check out a short about knife sharpening or just some cringe shit and you're all polluted.

[–] ragebutt@lemmy.dbzer0.com 47 points 6 days ago* (last edited 6 days ago)

Do these companies put their fingers on the scale? Almost certainly

But it's exactly what he said that brought us here. These companies have never particularly given a shit about politics (aside from "no taxes" and "let me do whatever I want all the time"). However, the algorithms consistently reward engagement. Engagement doesn't care about "good" or "bad"; it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting Elon to smoke weed. Someone talking about trans people playing sports. Etc.

This is a natural extension of human behavior. Behavior occurs because it serves a function, the function being to achieve reinforcement: attention, access to something, escape, or automatic reinforcement.

Attention-maintained behaviors are tricky because people are bad at removing attention, and attention is a powerful reinforcer. You tell everyone involved, "this person feeds off of your attention, ignore them." Everyone agrees. The problematic person pulls their bullshit, and then someone goes "stop it." They call that negative reinforcement (it's not negative reinforcement; it's probably positive reinforcement, or arguably positive punishment, though it's questionable how aversive it is).

You get people to finally shut up and they still make eye contact, or non verbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it and then someone new enters the mix who doesn’t get what’s going on.

This is the complexity behind all of this. This is the complexity behind “don’t feed the trolls”. You can teach every single person on Lemmy or reddit or whoever to simply block a malicious user but tomorrow a dozen or more new and naive people will register who will fuck it all up

The complexity behind the algorithms is similar. The algorithms aren't people, but they work in a similar way. If bad behavior is given attention, the content is weighted and given more importance. The more we, as a society, can't resist commenting on, clicking on, and sharing trump, rogan, peterson, transphobic, misogynist, racist, homophobic, etc. content, the more the algorithms will weight it as "meaningful" (a toy simulation of that loop follows).
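
To make that loop concrete, here's a purely illustrative simulation with invented numbers (not any real platform's weighting): each round, impressions are allocated in proportion to current weights, and observed engagement feeds back into the weights. A small per-impression edge compounds quickly.

```python
# Illustrative engagement feedback loop: impressions follow current weights,
# engagement bumps the weights, and a small edge compounds round over round.
weights = {"provocative": 1.0, "wholesome": 1.0}
engagement_rate = {"provocative": 0.06, "wholesome": 0.05}  # per impression

for _ in range(20):
    total = sum(weights.values())
    for name in weights:
        impressions = 1000 * weights[name] / total  # share of the feed
        weights[name] += impressions * engagement_rate[name]

share = weights["provocative"] / sum(weights.values())
print(f"provocative share of the feed after 20 rounds: {share:.0%}")
```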

This of course doesn't mean these companies are without fault. This is where content moderation comes into play. This is where the many studies finding that social media leads to higher irritability, more passive-aggressive behavior, and lower empathy could have pushed us to regulate these monsters into protecting their users against the negative effects of their products.

If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that it was absurd to allow to exist in a completely unregulated state, with zero transparency into its inner workings.

[–] SamuelRJankis@lemmy.world 52 points 6 days ago (3 children)

Instagram is probably notably worse. I have a very established account that should be very anti that sort of thing, and it keeps serving up idiotic guru garbage.

Tiktok is by far the best in this aspect, at least before recent weeks.

[–] IllNess@infosec.pub 17 points 6 days ago (1 children)

A couple of years ago, I started two other Instagram accounts besides my personal one. I needed to organize and have more control of what content I want to see at times I choose. One was mostly for combat sports, other sports, and fitness. The second one was just food.

The first one, right off the bat, showed me girls with OnlyFans accounts on the discovery page. Then after a few days, it began showing me right-wing content and alpha-male garbage.

The second one, the food account, showed alternative holistic solutions. Stuff like showing me 10 different accounts of people suggesting I consume raw milk. Then it started sending me a mix of people who only eat meat, and vegans.

It's really wild what these companies show you to complete your profile.

[–] gimmelemmy@lemmy.world 7 points 4 days ago

Yeah it sure does. There is no way that garbage should be showing up for me, but yet...

[–] Yerbouti@sh.itjust.works 7 points 4 days ago (3 children)

There's a Firefox extension to hide Shorts and another to default to your subscriptions feed. Along with uBlock, those are the only things that make YouTube usable.

[–] x00z@lemmy.world 24 points 6 days ago (4 children)

Filter bubbles are the strongest form of propaganda.

[–] doortodeath@lemmy.world 19 points 6 days ago (1 children)

I don't know if any of you still look at memes on 9gag. It once felt like a relatively neutral place, but the site has slowly pushed right-wing content over the last few years and is now infested with alt-right and even blatantly racist "memes" and comment sections. Feels to me like astroturfing to push viewers and posters in some political direction. As an example: during the US election, the war on Palestine suddenly became a recurring theme, depicting the Biden admin and Jews as "bad actors" and calling for Trump; after the election it became a flood of content about how Muslims are bad people and we shouldn't intervene in Palestine...

[–] GhostlyPixel@lemmy.world 21 points 6 days ago* (last edited 6 days ago) (3 children)

The view farming in shorts makes it even harder to avoid as well. Sure, I can block the JRE channel, for example, but that doesn’t stop me from getting JRE clips from probably day-old accounts which just have some shitty music thrown on top. If you can somehow block those channels, there’s new ones the next day, ad infinitum.

It’s too bad you can’t just disable the tab entirely, I feel like I get sucked in more than I should. I’ve tried browser extensions on mobile which remove the tab, but I haven’t had much luck with PiPing videos from the mobile website, so I can’t fully stop the app.

[–] shalafi@lemmy.world 25 points 6 days ago (17 children)

I'll get downvoted for this, with no explanation, because it's happened here and on reddit.

I'm a liberal gun nut. Most of my limited YouTube time is spent watching gun-related news and such. You would think I'd be overrun with right-wing bullshit, but I am not. I have no idea why. Can anyone explain? Maybe because I stick to the non-political, mainstream guntubers?

The only thing I've seen start to push me to the right was watching survival videos. Not some, "dems gonna kill us all" bullshit, simply normal, factual stuff about how to survive without society. That got weird fast.

[–] LovableSidekick@lemmy.world 10 points 5 days ago* (last edited 5 days ago) (5 children)

I found youtube shorts very annoying, because I have an attention span and can focus on something for more than 30 seconds. But if you right-click the three dots on a few Shorts sections and click Not Interested, youtube gets the hint and stops offering them to you. Win-win!

[–] T00l_shed@lemmy.world 9 points 5 days ago (4 children)

It doesn't seem to work for me. I also keep reporting and blocking ads, and they keep serving them up, so I've decided to disable YouTube on my phone.

[–] WrenFeathers@lemmy.world 12 points 5 days ago (2 children)

Same happened to me (live in WA) but not only do I get pro-tyranny ads and Broprah (Rogan) shorts, I also get antivax propaganda.

I always use the “show less of this” option or outright remove it from my feed. Seems better now.

[–] labbbb2@thelemmy.club 1 points 3 days ago* (last edited 3 days ago)

I have encountered this too. Around 2 months ago I visited YouTube to watch some video, rejected all cookies, and there were "woke" videos everywhere in the recommendations. I used it without an account.

[–] Valmond@lemmy.world 14 points 6 days ago

I bet those right-wing shorts get recommended and shoehorned in everywhere because someone pays for the visibility. Simple as that.

[–] vga@sopuli.xyz 11 points 6 days ago* (last edited 6 days ago)

The people where I live are -- I guess -- complete morons because whenever I try to check out Youtube without being logged in, I get the dumbest of dumb content.

But as another weird data point, I once suggested my son check out a Contrapoints video which I found interesting and about 1 year later she told me she wanted to get a surgery -- I don't exactly remember which kind as I obviously turned immediately into a catatonic far right zombie.

[–] FireTower@lemmy.world 15 points 6 days ago (10 children)

Saying it disproportionately promotes any type of content is hard to prove without first establishing how much of the whole is made up by that type.

The existence of proportionately more "right"-leaning content than "left"-leaning content could adequately explain the outcomes.
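
To put numbers on that base-rate point: under a deliberately politics-blind null model where the recommender samples uniformly at random, the first short from a class making up fraction p of the pool lands around index 1/p on average (geometric distribution). The observed 88-247 shorts would then be consistent with a pool share of roughly 0.4-1%, with no promotion needed. A quick check with assumed shares:

```python
# Null-model sanity check: under uniform random recommendation, the first
# short from a class with pool share p appears at index ~1/p on average.
for p in (0.004, 0.01, 0.05):
    print(f"pool share {p:.1%} -> first hit expected around short #{1 / p:.0f}")
```

This doesn't settle the question either way; it just shows why the base rate matters before calling the counts "promotion".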

[–] Sgt_choke_n_stroke@lemmy.world 7 points 5 days ago (1 children)

Instagram does the same thing with "dark jokes" and really weird UFO and conspiracy videos. It really sucks.

[–] LovableSidekick@lemmy.world 3 points 4 days ago (1 children)

Does this mean youtube preferentially selects alt-right shorts, or alt-right people make more shorts? Or some other thing entirely? Jump to your own conclusion.

[–] RisingSwell@lemmy.dbzer0.com 3 points 4 days ago

YouTube selects what gives YouTube the most views for the longest time. If that's right wing shorts, they don't care.

[–] ohlaph@lemmy.world 16 points 6 days ago (1 children)

If I see any alt-right content, I immediately block the account and report it. I don't see any now. I go to YouTube for entertainment only; I don't want that trash propaganda.
