this post was submitted on 26 Mar 2026
71 points (77.5% liked)

Technology

[–] KoboldCoterie@pawb.social 116 points 18 hours ago (5 children)

Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?

Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.

This feels like an awful argument to make. It's not the presence of those features that makes Meta and co so shit, it's the fact that they provably understood the risks and effects of their design, knew it was harming people, and continued anyway. I don't care if we're talking about a little forum run by a Grandma and Grandpa talking about their jam recipes; if they know they're causing harm and don't change their behavior, they should be liable.

[–] HeartyOfGlass@piefed.social 39 points 17 hours ago (1 children)

"We designed, marketed, and sold the gun, but we didn't think anyone would use it."

[–] KoboldCoterie@pawb.social 20 points 16 hours ago

It's like if someone had a forum where insurrectionists were discussing how to build bombs and where they were going to use them, and the owners had an internal meeting where they said, "Hey, we're hosting some pretty awful people, should we maybe report them or shut this down?" and the answer was, "Nah, they're paying users, and we want their money."

Pretty sure Section 230 wouldn't protect them, either.

[–] Chulk@lemmy.ml 29 points 17 hours ago (1 children)

Yeah this feels very much like, "censor content, but don't change Meta's practices"

Which raises the question: does the author know what they're cheering for?

[–] Maeve@kbin.earth 6 points 17 hours ago

You can bet they do.

[–] XLE@piefed.social 13 points 17 hours ago

It's like he's describing a slot machine with unpainted wheels, leaving out the context that it's in a casino with a big "paint me and enjoy a share of the profit" sign above it.

The social media machine was designed to be a self-serve addiction generator. It intentionally used every trick it could legally get away with.

[–] avidamoeba@lemmy.ca 8 points 17 hours ago

Also they can now generate content without users, which they already do a lot on Facebook.

[–] lmmarsano@group.lt -2 points 11 hours ago* (last edited 7 hours ago) (1 children)

I don't know. Seems like self-control issues. People can get addicted to anything: shopping, sex, internet use, work, gaming, exercise. I also disagree with prohibitions on gambling, drug use, prostitution: it's their money, their body, etc.

Penalizing systems of communication & information delivery seems like overreach. The harm seems phony & avoidable with basic self-control.

[–] KoboldCoterie@pawb.social 4 points 9 hours ago* (last edited 9 hours ago) (1 children)

Addictive personality is a proposed set of traits that makes sufferers more vulnerable to developing addictive behaviors, including things like gambling or social media. Does it help to frame it in a different light if you think of it as those companies exploiting vulnerable people's disorders to extract money from them?

Telling those people to just have self control is like telling someone with depression to just stop being sad.

[–] hitmyspot@aussie.zone 3 points 8 hours ago

Or telling someone stupid to be more clever, as the case may be.