this post was submitted on 20 Mar 2024
1012 points (98.0% liked)

Technology

top 50 comments
[–] Minotaur@lemm.ee 158 points 8 months ago (41 children)

I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.

It just feels very, very dangerous, and "going to be bad", to set this precedent where, when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.

Obviously some basic civil responsibility is needed. If someone says "I am going to blow up XYZ school, here is how", and you hear that, yeah, that's on you to report it. But it feels like we're quickly slipping toward a point where you have to start reporting vast numbers of people to the police en masse if they say anything even vaguely questionable, simply to avoid the potential fallout of being associated with someone who commits a crime.

It makes me really worried. I really think the internet has made it easy to "justifiably" accuse almost anyone, or any business, of a crime if a person with enough power, or the state, needs them put away for a time.

[–] dgriffith@aussie.zone 143 points 8 months ago (2 children)

This appears to be more about the person being fed an endless stream of hate on social media and thus becoming radicalised.

What causes them to be fed an endless stream of hate? Algorithms. Who provides those algorithms? Social media companies. Why do they do this? To maintain engagement with their sites so they can make money via advertising.

And so here we are, with sites that see you watched 65 percent of a stream showing an angry mob and conclude that you'd like to see more angry mobs in your feed. Is it any wonder that shit like this happens?
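
(For illustration, a minimal sketch of that feedback loop; the `Video` type, weights, and topic tags are invented to show the mechanism, not any platform's actual code.)

```python
from dataclasses import dataclass

@dataclass
class Video:
    topic: str
    base_popularity: float

# user_history: (topic, fraction_of_stream_watched) pairs from past sessions.
def score(video: Video, user_history: list[tuple[str, float]]) -> float:
    # Boost any candidate whose topic the user previously watched a big chunk of.
    affinity = max((r for t, r in user_history if t == video.topic), default=0.0)
    return video.base_popularity * (1.0 + affinity)

def build_feed(candidates: list[Video], user_history: list[tuple[str, float]], k: int = 20) -> list[Video]:
    # Pure engagement ranking: nothing here asks "is this good for the user?"
    return sorted(candidates, key=lambda v: score(v, user_history), reverse=True)[:k]

# Watching 65% of one angry-mob stream now outranks calmer content of equal popularity.
history = [("angry_mob", 0.65)]
print(build_feed([Video("angry_mob", 1.0), Video("gardening", 1.0)], history, k=2))
```

The loop closes because every view feeds back into the history, so whatever held attention last session is what gets amplified next.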

[–] PhlubbaDubba@lemm.ee 38 points 8 months ago (6 children)

It's also known to intentionally show you content that's likely to provoke you into fights online.

Which just makes all the sanctimonious screeds about avoiding echo chambers a bunch of horse shit, because that's not how social behavior works outside the net. Offline, if you go out of your way to keep arguing with people who wildly disagree with you, you're not avoiding echo chambers, you're building a class-action restraining-order case against yourself.

[–] Zak@lemmy.world 64 points 8 months ago (9 children)

I think the design of media products around maximally addictive individually targeted algorithms in combination with content the platform does not control and isn't responsible for is dangerous. Such an algorithm will find the people most susceptible to everything from racist conspiracy theories to eating disorder content and show them more of that. Attempts to moderate away the worst examples of it just result in people making variations that don't technically violate the rules.

With that said, laws made and legal precedents set in response to tragedies are often ill-considered, and I don't like this case. I especially don't like that it includes Reddit, which was not using that type of individualized algorithm to my knowledge.

[–] refurbishedrefurbisher@lemmy.sdf.org 19 points 8 months ago (5 children)

This is the real shit right here. The problem is that social media companies' data show that negativity and hate keep people on their website for longer, which means that they view more advertisement compared to positivity.

It is human nature to engage with disagreeable topics moreso than agreeable topics, and social media companies are exploiting that for profit.

We need to regulate algorithms and force them to be open source, so that anybody can audit them. They will try to hide behind "AI" and "trade secret" excuses, but lawmakers have to see above that bullshit.

Unfortunately, US lawmakers are both stupid and corrupt, so it's unlikely that we'll see proper change, and more likely that we'll see shit like "banning all social media from foreign adversaries" when the US-based social media companies are largely the cause of all these problems. I'm sure the US intelligence agencies don't want them to change either, since those companies provide large swaths of personal data to them.

[–] galoisghost@aussie.zone 39 points 8 months ago

Nah. This isn’t guilt by association

In her decision, the judge said that the plaintiffs may proceed with their lawsuit, which claims social media companies — like Meta, Alphabet, Reddit and 4chan — ”profit from the racist, antisemitic, and violent material displayed on their platforms to maximize user engagement,”

Which, despite their denials, they actually know: https://www.nbcnews.com/tech/tech-news/facebook-knew-radicalized-users-rcna3581

[–] Arbiter@lemmy.world 30 points 8 months ago

Yeah, algorithmic delivery of radicalizing content seems kinda evil though.

[–] rambaroo@lemmynsfw.com 23 points 8 months ago* (last edited 8 months ago) (3 children)

I don't think you understand the issue. I'm very disappointed to see that this is the top comment. This wasn't an accident. These social media companies deliberately feed people the most upsetting and extreme material they can. They're intentionally radicalizing people to make money from engagement.

They're absolutely responsible for what they've done, and it isn't "by proxy", it's extremely direct and deliberate. It's long past time that courts held them liable. What they're doing is criminal.

[–] WarlordSdocy@lemmy.world 19 points 8 months ago (5 children)

I think the distinction here is between people and businesses. Is it the fault of people on social media for the acts of others? No. Is it the fault of social media for cultivating an environment that radicalizes people into committing mass shootings? Yes. The blame here is on the social media companies for not doing more to stop the spread of this kind of content. Because yes, even though that won't stop this kind of content from existing, making it harder to access and find will at least reduce the number of people who go down this path.

[–] snooggums@midwest.social 15 points 8 months ago (3 children)

Systemic problems require systemic solutions.

[–] Simulation6@sopuli.xyz 98 points 8 months ago (14 children)

Add Fox News and Trump rallies to the list.

[–] Phanatik@kbin.social 72 points 8 months ago* (last edited 8 months ago) (9 children)

I don't understand the comments suggesting this is "guilty by proxy". These platforms have algorithms designed to keep you engaged and through their callousness, have allowed extremist content to remain visible.

Are we going to ignore all the anti-vaxxer groups who fueled vaccine hesitancy which resulted in long dead diseases making a resurgence?

To call Facebook anything less than complicit in the rise of extremist ideologies and conspiratorial beliefs is extremely short-sighted.

"But Freedom of Speech!"

If that speech causes harm, like convincing a teenager that walking into a grocery store and gunning people down is a good idea, you don't deserve to have that speech. Sorry, you've violated the social contract and those people's blood is on your hands.

[–] firadin@lemmy.world 29 points 8 months ago (8 children)

Not just "remain visible" - actively promoted. There's a reason people talk about Youtube's right-wing content pipeline. If you start watching anything male-oriented, Youtube will start slowly promoting more and more right-wing content to you until you're watching Ben Shaprio and Andrew Tate

[–] Ragnarok314159@sopuli.xyz 18 points 8 months ago (2 children)

I got into painting mini Warhammer 40k figurines during covid, and thought the lore was pretty interesting.

Every time I watch a video, my suggested feed goes from videos related to my hobbies to being entirely replaced with red-pill garbage. The right-wing channels must be highly profitable for YouTube to funnel people into: just an endless tornado of rage and constant viewing.

[–] BeMoreCareful@lemmy.world 13 points 8 months ago

YouTube is really bad about trying to show you right-wing crap. It's overwhelming. The Shorts are even worse; every few minutes there's some new suggestion for stuff that's way out of the norm.

TikTok doesn't have this problem, and it's the one being attacked by politicians?

[–] Nomad@infosec.pub 69 points 8 months ago

Nice, now do all religions and churches next.

[–] Socsa@sh.itjust.works 68 points 8 months ago (2 children)

Please let me know if you want me to testify that Reddit actively protected white supremacist communities and even banned users who engaged in direct activism against those communities.

[–] FenrirIII@lemmy.world 37 points 8 months ago (3 children)

I was banned for activism against genocide. Reddit is a shithole.

[–] porksoda@lemmy.world 67 points 8 months ago* (last edited 8 months ago) (5 children)

Back when I was on Reddit, I subscribed to about 120 subreddits. Starting a couple of years ago, though, I noticed that my front page really only showed content from 15-20 subreddits at a time, and it was heavily weighted towards recent visits and interactions.

For example, if I hadn't visited r/3DPrinting in a couple of weeks, it slowly faded from my front page until it disappeared altogether. It was so bad that I ended up writing a browser automation script to visit all 120 of my subreddits at night and click the top link. This ended up giving me a more balanced front page that mixed in all of my subreddits and interests.

My point is these algorithms are fucking toxic. They're focused 100% on increasing time on page and interaction with zero consideration for side effects. I would love to see social media algorithms required by law to be open source. We have a public interest in knowing how we're being manipulated.
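
(For the curious, a minimal sketch of the kind of nightly script described above, assuming Selenium and old-Reddit markup; the subreddit list and CSS selector are illustrative guesses, not the commenter's actual code.)

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Hypothetical subreddit list; the real one had ~120 entries.
SUBREDDITS = ["3Dprinting", "woodworking", "askscience"]

driver = webdriver.Firefox()
for sub in SUBREDDITS:
    driver.get(f"https://old.reddit.com/r/{sub}/")
    # On old Reddit, post titles are <a class="title"> links; clicking the
    # first one registers a visit and an interaction for that subreddit.
    links = driver.find_elements(By.CSS_SELECTOR, "a.title")
    if links:
        links[0].click()
driver.quit()
```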

[–] Fedizen@lemmy.world 15 points 8 months ago (1 children)

I used the Google News phone widget years ago and clicked on a giant-asteroid article, and for whatever reason my entire feed became asteroid/meteor articles. It's also just such a dumb way to populate feeds.

[–] skozzii@lemmy.ca 64 points 8 months ago (9 children)

YouTube feeds me so much right-wing bullshit that I'm constantly marking it as not interested. It's a definite problem.

[–] Duamerthrax@lemmy.world 13 points 8 months ago

It's amazing how often I get suggested a video from some right-wing source complaining about censorship and being buried by YouTube. I ended up installing a third-party channel blocker to deal with it.

[–] Krudler@lemmy.world 50 points 8 months ago* (last edited 8 months ago) (2 children)

I just would like to show something about Reddit. Below is a post I made about how Reddit was literally harassing and specifically targeting me, after I let slip in a comment one day that I was sober - I had previously never made such a comment because my sobriety journey was personal, and I never wanted to define myself or pigeonhole myself as a "recovering person".

I reported the recommended subs and ads to Reddit Admins multiple times and was told there was nothing they could do about it.

I posted a screenshot to DangerousDesign and it flew up to like 5K+ votes in like 30 minutes before admins removed it. I later reposted it to AssholeDesign where it nestled into 2K+ votes before shadow-vanishing.

Yes, Reddit and similar sites are definitely responsible for a lot of suffering and pain, at the expense of humans, in the pursuit of profit. After it blew up and front-paged, "magically" my home page didn't have booze-related ads/subs/recs any more! What a total mystery how that happened /s

The post in question, and a perfect "outing" of how Reddit continually tracks and tailors the User Experience specifically to exploit human frailty for their own gains.

Edit: Oh, and the hilarious part that many people won't let go of (when shown this) is that it says it's based on my activity in the Drunk subreddit, which I had never once visited, commented in, posted in, or was even aware of. So that just makes it worse.

[–] mlg@lemmy.world 18 points 8 months ago (1 children)

It's not Reddit if posts don't get nuked or shadowbanned by literal sitewide admins.

[–] ristoril_zip@lemmy.zip 45 points 8 months ago (7 children)

"Noooo it's our algorithm we can't be held liable for the program we made specifically to discover what people find a little interesting and keep feeding it to them!"

[–] RagingRobot@lemmy.world 16 points 8 months ago (5 children)

I wonder: if you built a social media site where the main feature was that the algorithm just showed you things in sequential order like in the old days, would it be popular?
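
(For contrast with the engagement rankers discussed above, the entire "algorithm" of such a site fits in a few lines; a sketch with an invented `Post` type.)

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    created_at: float  # Unix timestamp

def chronological_feed(posts: list[Post]) -> list[Post]:
    # Newest first; no engagement signals, no per-user targeting.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```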

[–] RaoulDook@lemmy.world 14 points 8 months ago

I enjoy using Lemmy mostly that way, just sorting the feed by new / hot / whatever and looking at new posts of random shit. Much more entertaining than video-spamming bullshit.

[–] The_Tired_Horizon@lemmy.world 25 points 8 months ago (6 children)

I gave up reporting on major sites where I saw abuse. Stuff that, if you said it in public, witnessed by others, you'd be investigated for. Twitter was also bad for responding to reports with "this doesn't break our rules" when a) it clearly did, and b) it probably broke a few laws.

[–] reverendsteveii@lemm.ee 20 points 8 months ago* (last edited 8 months ago) (4 children)

I gave up after I was told that people DMing me photographs of people committing suicide was not harassment, but me referencing Yo La Tengo's album "I Am Not Afraid Of You And I Will Beat Your Ass" was worthy of a 30-day ban.

[–] Jaysyn@kbin.social 25 points 8 months ago* (last edited 8 months ago) (4 children)

Good.

There should be no quarter for fascists, violent racists, or their enablers.

Conspiracy for cash isn't a free speech issue.

[–] scottmeme@sh.itjust.works 22 points 8 months ago (1 children)

Excuse me what in the Kentucky fried fuck?

As much as everyone says "fuck these big guys" all day, this hurts everyone.

[–] athos77@kbin.social 22 points 8 months ago* (last edited 8 months ago) (4 children)

I agree with you, but ... I was on Reddit since the Digg exodus. It always had its bad side (violentacrez, jailbait, etc.), but it got so much worse after GamerGate/Ellen Pao: the misogyny became weaponized. And then the alt-right moved in, deliberately trying to radicalize people, and we worked so. fucking. hard to keep their voices out of our subreddits. And we kept reporting users and other subreddits that were breaking rules, promoting violence and hatred, and all fucking spez would do is shrug and say, "hey, it's a free speech issue", which was somewhere between "hey, I agree with those guys" and "nah, I can't be bothered".

So it's not like this was something Reddit wasn't aware of (I'm not on Facebook or YouTube). They were warned, repeatedly, vehemently, starting all the way back in 2014, that something was going wrong with their platform and they needed to do something. And they deliberately and repeatedly chose to ignore it, all the way up to the summer of 2021. Seven fucking years of warnings they ignored, from a massive range of users and moderators, including some of the top moderators on the site. And all Reddit would do is shrug its shoulders and say, "hey, free speech!" like it was a magic wand, and very occasionally try to defend itself by quoting its 'hate speech policy', which they invoke with the same regular repetitiveness and 'thoughts and prayers' inaction as a school shooting brings. In fact, they did it in this very article:

In a statement to CNN, Reddit said, “Hate and violence have no place on Reddit. Our sitewide policies explicitly prohibit content that promotes hate based on identity or vulnerability, as well as content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or group of people. We are constantly evaluating ways to improve our detection and removal of this content, including through enhanced image-hashing systems, and we will continue to review the communities on our platform to ensure they are upholding our rules.”

As someone who modded for a number of years, that's just bullshit.

Edit: fuck spez.

[–] Fedizen@lemmy.world 22 points 8 months ago (1 children)

media: Video games cause violence

media: Weird music causes violence.

media: Social media could never cause violence this is censorship (also we don't want to pay moderators)

[–] Zuberi@lemmy.dbzer0.com 19 points 8 months ago

Fuck Reddit, can't wait to see the IPO burn

[–] Not_mikey@slrpnk.net 18 points 8 months ago (4 children)

Sweet, I'm sure this won't be used by AIPAC to sue all the tech companies for causing October 7th somehow, like UNRWA, and force them to shut down or suppress all talk on Palestine. People hearing about a genocide happening might radicalize them. Maybe we could get away with allowing discussion, but better safe than sorry; to the banned-words list it goes.

This isn't going to end with the tech companies hiring a team of skilled moderators who understand the nuance between passion and radical intention, trying to preserve a safe space for political discussion; that costs money. This is going to end with a dictionary of banned and suppressed words.

[–] charonn0@startrek.website 17 points 8 months ago

I think there's definitely a case to be made that recommendation algorithms, etc. constitute editorial control and thus the platform may not be immune to lawsuits based on user posts.

I will testify under oath, with evidence, that Reddit, the company, has not only turned a blind eye to but also encouraged and intentionally enabled radicalization on their platform. It is the entire reason I am on Lemmy. It is the entire reason for my username. It is the reason I questioned my allyship with certain marginalized communities. It is the reason I tense up at the mention of turtles.

[–] Kalysta@lemmy.world 14 points 8 months ago (4 children)

Love Reddit's lies about taking down hateful content when they're 100% behind Israel's genocide of the Palestinians and will ban you if you say anything remotely negative about Israel's government. And the amount of transphobia on the site is disgusting, let alone the misogyny.
