this post was submitted on 20 Mar 2024
1012 points (98.0% liked)

Technology

[–] Minotaur@lemm.ee 158 points 8 months ago (74 children)

I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.

It feels very dangerous to set a precedent where, when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.

Obviously some basic civil responsibility is needed. If someone says “I am going to blow up XYZ school, here is how”, and you hear that, yeah, that’s on you to report it. But it feels like we’re quickly slipping toward a point where you have to report a vast number of people to the police en masse if they say anything even vaguely questionable, simply to avoid the potential fallout of being associated with someone who commits a crime.

It makes me really worried. I really think the internet has made it easy to ‘justifiably’ accuse almost anyone, or any business, of a crime if a person with enough power, or the state, needs them put away for a time.

[–] Zak@lemmy.world 64 points 8 months ago (14 children)

I think the design of media products around maximally addictive individually targeted algorithms in combination with content the platform does not control and isn't responsible for is dangerous. Such an algorithm will find the people most susceptible to everything from racist conspiracy theories to eating disorder content and show them more of that. Attempts to moderate away the worst examples of it just result in people making variations that don't technically violate the rules.
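The feedback loop described above can be sketched as a toy model. This is purely illustrative and not any platform's actual code; the function names, catalog, and categories are invented for the example. It shows how a recommender that only optimizes for past engagement ends up showing a user more of whatever they already clicked on:

```python
# Illustrative toy model (not any real platform's implementation):
# an engagement-maximizing recommender that keeps feeding a user more
# of whatever category they engaged with most, creating a feedback loop.
from collections import Counter

def recommend(engagement_history: list[str],
              catalog: dict[str, list[str]],
              k: int = 3) -> list[str]:
    """Return up to k items from the category the user engaged with most."""
    if not engagement_history:
        # Cold start: fall back to the first catalog category.
        top_category = next(iter(catalog))
    else:
        # Pick the single most-engaged category and nothing else.
        top_category = Counter(engagement_history).most_common(1)[0][0]
    return catalog.get(top_category, [])[:k]

catalog = {
    "cooking": ["pasta tips", "knife skills", "bread basics"],
    "conspiracy": ["post A", "post B", "post C"],
}

# A user who clicked conspiracy content twice now sees only more of it,
# even though they also engaged with cooking content once.
print(recommend(["cooking", "conspiracy", "conspiracy"], catalog))
# → ['post A', 'post B', 'post C']
```

The point of the sketch is that nothing here checks what the content *is*; the loop amplifies whatever maximized engagement, which is why moderating individual posts doesn't break the dynamic.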

With that said, laws made and legal precedents set in response to tragedies are often ill-considered, and I don't like this case. I especially don't like that it includes Reddit, which was not using that type of individualized algorithm to my knowledge.

[–] rambaroo@lemmynsfw.com 3 points 8 months ago (1 children)

Reddit is the same thing. They intentionally enable and cultivate hostility and bullying there to drive up engagement.

[–] deweydecibel@lemmy.world 2 points 8 months ago (1 children)

But not algorithmically catered to the individual.

[–] Kalysta@lemmy.world 1 points 8 months ago

Which is even worse because more people see the bullying and hatred, especially when it shows up on a default sub.
