this post was submitted on 25 Mar 2026
1230 points (99.0% liked)

Technology

[–] MyMindIsLikeAnOcean@piefed.world -1 points 3 weeks ago (1 children)

They should be compelled to…sell fewer ads? Silly. What do you mean by “tools”? There are a gajillion tools that nobody understands or uses…we need more responsibility from the purveyor…not the user. Saying you want tools is the status quo.

Moderation is the only solution. Social media companies should be required, with no exceptions, to follow the laws of the region they operate in. They don’t do that…they put out whatever, whenever, and take almost no responsibility for what they expose people to.

[–] deathbird@mander.xyz 0 points 2 weeks ago (1 children)

By "tools" I generally mean software, and the options/functionalities that software offers through its regular user interface, that enable one to modify the software's outputs and thus one's user experience. In this sense Windows 11 is a "tool" that, as an operating system, enables one to use a computer, but it also supplies tools to modify the experience, such as one that lets a privileged user prevent non-privileged users from uninstalling software or sharing a printer to the LAN, right? Facebook (a software deployment owned and remotely hosted by Meta) has a tool that allows a person with a JavaScript-enabled web browser (also a tool) or Meta's proprietary application to send a message to a stranger on the internet, or to a known person, along with a lot of other things, right?

Now, what Windows 11 doesn't have is a tool that lets me easily locate my mouse pointer on screen, but that's okay, because I can install PowerToys to gain that functionality. I can also install software that modifies the Facebook experience to some degree, but there's not a lot of it for various reasons, and certainly I can't find any that sells itself as a child-safety or parental-control solution. But that makes sense, right? Because to serve that function, it would have to be deployable across every computer the child uses to access that remote service, and it would have to be updated to match every change in that service's software, as if your shadow were attached to your feet. Not practical at all.

Obviously this is of limited use, which is why people who use tools to modify their experience of social media sites like FB are usually doing so merely for their own comfort and enjoyment. That's valid, but it's not the same purpose as parental control. And the relationship between the remote service and the local software developer is adversarial. This is why there are plenty of parental-control tools to block a website, but none to modify one.

I actually agree that moderation is the solution, but not in the way you mean. FB doesn't create content; it just facilitates people sharing their own (bots too, but set that aside). I don't think any sane person believes that Meta or lemmy [dot] world or any other platform could continue to exist if it were held responsible for what its users said. Platforms make what moderation efforts they do to avoid getting DMCAed, to keep themselves advertiser-friendly, and to make their services sufficiently enjoyable to the users those advertisers want their ads to be seen by. That last bit is important, but look at just the first two, a legal regulation and "regulation" by market forces in the wild, and you can see how these already cause problems. What platforms like FB don't give you, because they don't want you to have it, is control over your user experience.

FB doesn't want you to have tools (account options) to moderate your own or your child's experience on their platform because it would cost them money, both in development costs and opportunity costs. But that's what's actually needed to make FB an enjoyable and even child-safe experience. Not broad legal "moderation" demands that no platform could survive without obscenely invasive company-side tools and exploitative labor outsourcing, but functional tools (which yes, would have to be mandated by law, because they won't build them voluntarily) that enable the user to control their own experience. It's a question of: do you want some underpaid, thrice-subcontracted Indian/Nigerian tech workers reading your teen's sexts with his boyfriend and making judgment calls as to their appropriateness, or do you want the capacity to simply allow communication between those two accounts without monitoring them, while retaining the ability to block DMs from unknown accounts so your kid doesn't get groomed by a stranger? We're constantly told we have to choose between total system control and the Wild West, but we're only encouraged to consider those two possibilities because they're what's cheapest for the companies.

[–] MyMindIsLikeAnOcean@piefed.world 0 points 2 weeks ago (1 children)

It just sounds like you’re a lawyer for Facebook. More of the same…more user-end “tools” that nobody uses and get abandoned, more harm to everyone. Down the well we go.

[–] deathbird@mander.xyz 0 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Would have been nice if you'd read what I wrote, but okay.

What Facebook wants is mandatory age checks at the OS level so they can just call an API and avoid all responsibility within their own platform.

What Facebook doesn't want is users being able to control their own experience of the platform.

[–] MyMindIsLikeAnOcean@piefed.world 0 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I can’t really say it any more ways. One last time.

Yes, of course Facebook wants to push unmoderated addictive content on all their users.

But yes, Facebook also loves putting out endless “user tools” so they can push the responsibility off of themselves, for the same reason. These tools already exist. Tools are absolutely useless when you’re trying to protect at-risk children, or people in general…it’s like asking people to be their own doctor.

All social media needs to be regulated at a fundamental level, and that regulation must include each agent being responsible for the content their users post. Putting out more tools so users can block ads or control their kids will make things worse, as the companies continue the arms race for attention. The only people who benefit from tools are helicopter parents and the tech savvy.

[–] deathbird@mander.xyz 1 points 2 weeks ago (1 children)

No, I get what you're saying, but your understanding of the world as it exists is incorrect, and your values favor oppression over freedom.

Your incorrect understanding of reality: the on-platform tools that currently exist on Facebook are useless. You are powerless through account settings to limit your exposure to content from strangers on your feed, much less your child's, except by individually blocking accounts as you encounter them, while logged into the account you want them blocked from. Even Bluesky, which also has insufficient tools, is slightly better in this regard. The few on-platform tools you're offered exist only to give you the illusion of control over your experience. Greater control is possible but not offered, because it's less profitable. It could be mandated through law.

Your anti-freedom values: making platforms responsible for user content will destroy them or force severe proactive censorship and real identity policies. None of that is conducive to a free and open society. The fediverse could not exist if servers could be held responsible for what users say or do. Most of the Internet couldn't exist if one rogue or politically unpopular user could land the service they use in court by offending another.

Your last paragraph is complete nonsense. The way to win an arms race is to come in with bigger arms. That's where the government comes in: not to force its own will, but to restrain companies and empower people. The notion that giving people greater control over their own experiences can harm them is insane.

[–] MyMindIsLikeAnOcean@piefed.world 1 points 2 weeks ago (1 children)

What you’re saying is incoherent.

On one hand you want government mandated tools…on the other you want unlimited freedom.

I just want the content that’s posted on social media platforms to be legal. I don’t know why you’re babbling about restriction of freedom. You want illegal content to be…legal?

[–] deathbird@mander.xyz 1 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I'm not sure what data-speech you personally think should or shouldn't be legal, but I know what kinds a lot of people argue should be illegal: things ranging all the way from videographic records of child abuse (CSAM) to unauthorized copyrighted material to libel to hate speech to blasphemy and plenty else not mentioned. I think some of it is deservedly illegal (e.g. CSAM) and some of it shouldn't be (e.g. blasphemy).

My position is that in a pluralistic society there will be a variety of speech that people won't want to see for various reasons, and they have a right not to see it. They have a right to have tools that allow them to not see things they don't want to see. And government censorship of speech should be limited to the absolute bare minimum of speech that causes material harm, and legal responsibility for those rare instances of illegal speech should fall upon the speaker and not the platform or carrier.

[–] MyMindIsLikeAnOcean@piefed.world 1 points 2 weeks ago (1 children)

The only thing I’m talking about is social media companies moderating their platforms so there’s zero tolerance for illegal communication…the laws currently legislated in a region.

Currently, in North America, social media companies moderate themselves…typically with user reporting and automation. There’s an hours-long gap between infraction and action.

This could be eliminated with proper moderation. I believe this is the bare minimum. The current status quo is the Wild West…children and adults alike are bombarded with illegal content each time they use social media, or the internet at large.

[–] deathbird@mander.xyz 0 points 1 week ago (1 children)

While I'm bombarded by obnoxious content on social media, I very very rarely see content that is illegal in my area. Let's stick a pin in that.

According to a few sources I've seen, 500 hours of video are uploaded to YouTube every minute. Suppose Alphabet were to be held liable for any of that content being illegal, under strict liability. Would they rely on automated systems to check the content, or human eyes? They might use some automation as a pre-check, but they'd be fools to rely on it, because if it misses something they're on the hook. So how many FTEs would you need to hire just to watch the videos uploaded to YouTube? Not even counting breaks, pauses, double-checking, etc., you're looking at around 30,000 people watching at all times. Pay them $15/hr with no benefits and that's $10,800,000 per day, close to $4 billion a year, and that's super lowballing it, because I'm not counting realistic wages, administrative overhead, benefits, or a realistic work pace. Maybe Alphabet could still afford it; they grossed hundreds of billions last year, and while they have lots of other expenses, some of that was presumably profit. But then I'd ask: could anyone other than Alphabet afford it? Your average PeerTube instance, for instance? The same applies to all the rest.
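The arithmetic above fits in a few lines. A quick sketch, using the same assumed figures (the widely cited 500 hours/minute upload rate and a lowball $15/hr wage), not audited numbers:

```python
# Back-of-envelope cost of fully human-reviewing YouTube uploads.
# Both constants are assumptions from the comment above, not audited data.
HOURS_UPLOADED_PER_MINUTE = 500  # widely cited YouTube statistic
WAGE_PER_HOUR = 15.0             # USD, no benefits (lowball)

# 500 hours of video arrive every wall-clock minute, so keeping up in
# real time takes 500 * 60 = 30,000 reviewer-hours per hour, i.e.
# 30,000 people watching around the clock.
reviewers_needed = HOURS_UPLOADED_PER_MINUTE * 60

cost_per_day = reviewers_needed * WAGE_PER_HOUR * 24
cost_per_year = cost_per_day * 365

print(reviewers_needed)                    # 30000
print(f"${cost_per_day:,.0f} per day")     # $10,800,000 per day
print(f"${cost_per_year / 1e9:.1f}B/year") # $3.9B/year
```

Even this floor, with no breaks, no double-checking, no management, and wages near the US minimum, lands around $4 billion a year for a single video platform.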

But back to my first observation. I don't see a lot of stuff that's illegal. I see things that are obnoxious, distracting, etc., but not illegal. It makes me wonder how you conduct yourself as an adult, or what your perspective on lawful speech is, if you find yourself constantly bombarded by material that you believe is or should be illegal.

[–] MyMindIsLikeAnOcean@piefed.world 0 points 1 week ago* (last edited 1 week ago) (1 children)

You see illegal content all the time, even if you won’t acknowledge you do.

They should be compelled to do whatever it takes so that illegal content isn’t available on their platforms. There’s absolutely no reason user posts, and especially advertisements, need to be available instantaneously. The notion that doomscrolling in real time, or being able to broadcast your message to the entire world instantly, is somehow comparable to freedom of speech or speaking in a town square is absurd.

I couldn’t give a single rat’s ass how much it costs. And no…your ballpark figure is absurd. Don’t make some sky-is-falling guesstimation straight out of the playbook of the very people who don’t want their profits infringed on. More money on moderation = less money to buy elections and legislation with. We transitioned away from print and broadcasting that, while somehow respecting free speech, was pretty fairly regulated. Those “safe” mediums are all but dead because we allowed these social media companies to successfully lobby to avoid all responsibility for illegal content and communication on their platforms.

I’ll give you a practical example of a microcosm: bots in online games. Bots could be functionally eliminated for a portion of the profits of any MMO. We know they could be because they were…up until the point the companies realized it’s more profitable to have them. Because they’re not legally compelled to get rid of bots…they’re actually incentivized to allow a certain portion of their population to be bots, because it increases engagement. Why? Because we don’t regulate shit…and the companies are drunk on profits and enshittifying everything because there’s no competition in any sector anymore (antitrust, another adjacent topic: Activision, and eventually Microsoft, should never have been permitted to purchase a profitable Blizzard).

Take just advertising…let’s put everything else aside and agree that social media companies should be responsible for the fucking advertising they post…just like any broadcaster. We can’t even do that. At any given moment, advertisers on every social media platform are bombarding users with anonymous advertisements that break every law in the book…from copyright infringement, to election meddling, to sexually explicit material, to illegal gambling and all corners beyond. We can’t even regulate that. It’s fuckery that’s rotting the world in real time…because we care more about Zuckerberg’s bank account than our society.

[–] deathbird@mander.xyz 1 points 1 week ago

While I appreciate your disdain for the titans of industry, the policy you advocate, holding platforms responsible for user content, is like tearing up the railways. It reminds me of those ridiculous laws from early in the automobile era that required a person to sound alarms and wave flags before driving through a city: policy custom-designed to undermine the utility and sustainability of the very thing it's meant to regulate. It would also destroy email, VPS services, VPN services, etc.

I agree with the bit about antitrust though. And holding them responsible for advertisements is a very different question, because they actively solicit and promote advertisements. But otherwise your policy positions are insane.