[–] Steve@communick.news 2 points 17 hours ago* (last edited 17 hours ago) (1 children)

On one hand, the judge is right. On the other hand, the lawyer is right. Then, on two more hands, they're both wrong.

Yes, it's bad to legislate by moral panic. Yes, kids are addicted to social media. Those are both facts.

The reason age gating is a bad idea isn't moral panic, or "the children". It's that we're ALL addicted to social media, adults just as much as kids. The problem is the intentionally addictive algorithms, meticulously engineered to keep us scrolling. I'm telling you, in 50 years we'll know how the social media companies were hiding and lying about the addictive, harmful nature of their business, just like we know about the tobacco and oil companies today.

The best solution I can think of is to revisit Section 230. You can't hold these companies responsible for what people post to their sites, but you can and must hold them accountable for what they recommend! If you have a simple, easily definable sorting or ranking system over what people choose to follow? You're fine, no accountability for something bad showing up. If you have some black-box algorithm of infinite scrolling, based on criteria so complex that nobody can really explain exactly why a specific post was shown to a specific individual? Now you're on the hook for what they see.
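To make the distinction concrete, here's a rough sketch (all the names here are hypothetical, and `model` stands in for whatever black-box recommender a platform actually runs):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    text: str

def followed_feed(posts: list[Post], followed: set[str]) -> list[Post]:
    """Transparent ranking: only accounts the user follows, newest
    first. Why any given post appears is trivially explainable."""
    return sorted(
        (p for p in posts if p.author in followed),
        key=lambda p: p.created_at,
        reverse=True,
    )

def engagement_feed(posts: list[Post], model, profile) -> list[Post]:
    """Opaque ranking: a learned model scores every post against a
    behavioral profile. Nobody can say exactly why one post outranked
    another, and that's where the accountability should kick in."""
    return sorted(posts, key=lambda p: model.predict(profile, p), reverse=True)
```

The first function would keep a platform safe under this scheme; the second would put it on the hook for whatever it surfaces.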

[–] Flax_vert@feddit.uk 1 points 17 hours ago (1 children)

I think it would depend on what they recommend. Some algorithms are fine: hashtags in common with content you liked, posts from the same person, posts that are broadly well liked that day, and obviously stuff you follow. But deliberately engineering stuff that annoys you into your feed, or pushing the same political agenda on everyone regardless of how they interact with the platform, shouldn't be okay.
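For example, a scorer built only from those simple signals could name its reasons (field names and weights here are made up for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    hashtags: set[str]
    likes_today: int = 0

@dataclass
class User:
    liked_hashtags: set[str] = field(default_factory=set)
    liked_authors: set[str] = field(default_factory=set)

def explainable_score(post: Post, user: User) -> tuple[float, list[str]]:
    """Score a post from simple, auditable signals; every point of
    score carries a human-readable reason that could be shown to
    the user on demand."""
    score, reasons = 0.0, []
    shared = post.hashtags & user.liked_hashtags
    if shared:
        score += len(shared)
        reasons.append(f"shares hashtags with posts you liked: {sorted(shared)}")
    if post.author in user.liked_authors:
        score += 2.0
        reasons.append("from a person whose posts you liked")
    if post.likes_today > 1000:
        score += 0.5
        reasons.append("broadly well liked today")
    return score, reasons
```

If a platform can print that reasons list for any recommendation it makes, that's the kind of system I'd call fine.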

[–] Steve@communick.news 1 points 16 hours ago* (last edited 16 hours ago)

Yes, the idea isn't that they aren't allowed to recommend anything. It's that they can be held accountable (i.e. sued) if what they recommend leads to people being radicalized by a hate group, or attempting suicide from cyberbullying, or even just needing extra therapy from doomscrolling ourselves to sleep. Right now, Section 230 says they can't be held liable for anything on their sites, which is obviously stupid.