this post was submitted on 02 Sep 2024
754 points (98.1% liked)

Technology

59627 readers
2911 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] OldWoodFrame@lemm.ee 8 points 2 months ago (2 children)

We don't dislike government censorship of CSAM. It's all a spectrum based on the legitimacy of the government's order and the legitimacy of the tech billionaire's refusal to comply.

[–] sugar_in_your_tea@sh.itjust.works 3 points 2 months ago (1 children)

Honestly, while I think CSAM is disgusting, I am kind of against government censorship of it. Some laws go so far as to ban anything resembling CSAM, including imagery that looks like it but doesn't actually involve a real child. The problem is the abuse required to create it; if that abuse didn't happen, there is no crime, and it should therefore be completely legal.

The same goes for free speech more broadly. The speech itself should never be illegal, but it should be usable as evidence of another crime. A threat of violence is the crime, and that should be prosecuted, but that shouldn't mean the government forces the host to censor the speech; that should be at the host's discretion. What the government can do is subpoena information relevant to the investigation, but IMO it shouldn't compel any entity to remove content.

That said, Brazilian law isn't the same as US law, and X and SpaceX should respect the laws of all of the countries in which they operate.

[–] Cryophilia@lemmy.world 3 points 2 months ago (1 children)

That's...actually a pretty reasonable take. Fuck Musk, but you've convinced me that government censorship is just a bad thing in general and that should apply to Musk as much as anyone else.

I do think there's a counterargument to be made that the resources involved in setting up fake accounts to spread bullshit are trivial compared to the resources required to track down and prosecute account owners for crimes, so in a practical sense banning accounts is possibly the only thing one can do (especially if the account owners are foreign). If you give lies the same freedom as truth, you tend to end up with 10 lies for every truth.

[–] Omniraptor@lemm.ee 1 points 2 months ago* (last edited 2 months ago) (1 children)

OP's take is not reasonable, IMO: if you think threats are harmful enough to prosecute, they should also be harmful enough to censor.

Maybe a softer form of censorship, such as hiding them behind a content warning and a "user was vanned for this post" label rather than outright removal, but you can't just do nothing.

[–] Cryophilia@lemmy.world 2 points 2 months ago (1 children)

Prosecution implies a trial before punishment. Censorship is immediate punishment based solely on the judgment of the authorities. That's not a minor difference.

[–] sugar_in_your_tea@sh.itjust.works 1 points 2 months ago (1 children)

Exactly. If a judge states that an individual is no longer allowed on social media, then I absolutely understand banning the account and removing their posts. However, until justice has been served, it's 100% the platform's call, and I think platforms should err on the side of allowing speech.

[–] Cryophilia@lemmy.world 2 points 2 months ago (1 children)

I realize I'm jumping back and forth between sides here, but that's because it's a complex problem and I haven't made up my mind. That said, to return to the previous point: if you need a court order to ban every spammer and troll, you'll drown in spam and propaganda. The legal system can't keep up.

I'm not saying companies should need a court order to remove content, only that a court order should be the only thing that obligates them to remove it, and ideally they'd lean toward keeping content rather than removing it. I think it's generally better for platforms to let users hide content they don't want to see instead of removing it outright for everyone. One person's independent journalism is another person's propaganda, and I generally don't trust big tech companies with agendas to decide between the two.

[–] flashgnash@lemm.ee 1 points 2 months ago (1 children)

I'm willing to bet the people the government wanted banned were not, in fact, posting CSAM. I'm pretty sure even X would ban them of its own volition pretty quickly if they were doing that.

[–] OldWoodFrame@lemm.ee 2 points 2 months ago (1 children)

They weren't; it was just the example at the furthest end of the spectrum. But your framing of "if it was REALLY bad, Twitter would ban it" cannot be the solution. We have legitimate governments tasked with governing based on the will of the people; it's not better to just let Elon Musk or Mark Zuckerberg decide the law.

[–] flashgnash@lemm.ee 1 points 2 months ago

They would ban it if it was really bad, because it's illegal for that material to exist and they would face much more serious consequences as a company if they didn't remove it. They're not doing it out of the goodness of their hearts.

It's also not a good look for a company's PR to be hosting that material in general, and PR is determined entirely by the general public's reaction to their actions, not by a small group of individuals in powerful positions.