this post was submitted on 10 Jan 2024
128 points (98.5% liked)

Just a day after Unity announced it would be laying off 1,800 employees as part of an ongoing "company reset", it's bei…

[–] Kalkaline@leminal.space 12 points 10 months ago (3 children)

Content moderation would be fairly labor intensive.

[–] vexikron@lemmy.zip 11 points 10 months ago* (last edited 10 months ago) (1 children)

Content moderation at this kind of scale is:

  1. Impossible to do without both humans and algorithms

  2. Always going to produce absurd, hypocritical, and controversial interpretations of (and changes to) the TOS

  3. Going to cause a horrific workload for those programming the algorithms, who must devise ways of screening for poorly defined and constantly changing TOS no-nos (see the sketch at the end of this comment)

  4. Going to literally traumatize and cause massive mental damage to the human moderators

Facebook has these problems at an even worse scale, and still operates what are basically computer-equipped sweatshops of thousands of people in less economically developed parts of the world, most of whom report massive mental trauma from having to review absolutely horrific content, day in and day out, for years.
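
To make point 3 concrete, here's a minimal sketch of what hybrid triage tends to look like. Every name and threshold is made up for illustration; none of this is taken from any real platform's system:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """One piece of flagged content awaiting a moderation decision."""
    id: str
    classifier_score: float  # 0.0 (benign) .. 1.0 (near-certain violation)
    matched_rules: list = field(default_factory=list)  # TOS rules tripped by keyword/fingerprint checks

# These move every time the TOS changes -- hence the programmer workload above.
AUTO_REMOVE_THRESHOLD = 0.98
HUMAN_REVIEW_THRESHOLD = 0.60

def triage(item: Item) -> str:
    """Auto-act only on near-certain cases; everything ambiguous goes to a person."""
    if item.classifier_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"
    if item.classifier_score >= HUMAN_REVIEW_THRESHOLD or item.matched_rules:
        return "human_review"
    return "allow"

print(triage(Item("clip-1", 0.99)))                # auto_remove
print(triage(Item("clip-2", 0.70)))                # human_review
print(triage(Item("clip-3", 0.10, ["rule-7.3"])))  # human_review
print(triage(Item("clip-4", 0.10)))                # allow
```

And note the structural problem: wherever you set the thresholds, the "human_review" bucket is the one that grows with scale, and that bucket is exactly the traumatizing work in point 4.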

[–] Kalkaline@leminal.space 2 points 10 months ago (1 children)

100% agree, but the alternative is giving the average user moderation power and hoping they do a good job of it, or not moderating at all.

[–] vexikron@lemmy.zip 2 points 10 months ago* (last edited 10 months ago)

That's /an/ alternative.

Another alternative is /social networks this large should not exist/.

There are many, many other alternatives.

It's just that social networks this large have basically destroyed the brains of the people who use them, so now they can hardly imagine alternatives.

And that is /another/ argument for why they shouldn't exist: they normalize themselves the way social and cultural institutions do, but with none of the accountability that local and state governments at least theoretically have.

This also explains why such things are unlikely to go away. Beyond being addictive at the individual level, the network effect creates peer pressure to engage more, and it otherizes those who don't, making them social outcasts, at least among the relevant age ranges for a given platform. It has also become pervasive in matters of direct economic importance, with many companies refusing to hire, and apartments refusing to rent, if they cannot first verify your social media presence on these large platforms.

To slightly inaccurately quote Morpheus from Deus Ex:

The human being desires judgement. Without this, group cohesion is impossible, and thus civilization.

At first you (humans) worshipped gods; then, the fame and fortune of others. Next, it will be autonomous systems of surveillance, pervasive everywhere.

Welp, it turns out that real-life, mass-scale social media networks are literally a hybrid, or synthesis, of the latter two mechanisms of social reverence/judgement.

[–] sugar_in_your_tea@sh.itjust.works -1 points 10 months ago* (last edited 10 months ago) (2 children)

Fair, but surely a lot of that is automated, no? You'd want a human to review it, but it's not like you'd need people watching the streams constantly.

I'm just saying that eliminating 500 people means they have a lot more than 500 people working there, probably well over 2k. That's way bigger than I expected.

[–] Chozo@kbin.social 7 points 10 months ago (1 children)

I think you greatly underestimate how large of a platform Twitch truly is. They have over thirty million daily active users.

[–] sugar_in_your_tea@sh.itjust.works 0 points 10 months ago* (last edited 10 months ago)

Probably. I only watch one streamer, and only occasionally.

That said, headcount shouldn't need to scale much with user count. Look at Valve, which has ~360 employees and has hit 33.5M concurrent users, with ~11M playing a game. Here's some of what Valve does:

  • hardware products, like Steam Deck and Valve Index
  • Windows compat - Proton; granted, most of the people working on this aren't Valve employees, but contractors Valve pays
  • make games - not often, but there's still maintenance work
  • manage a CDN - not quite as much data as Twitch, but still substantial, and likely not hugely different in the manpower needed to maintain it
  • Steam Link app - available on many platforms
  • Steam mobile app
  • Steam app - Linux, Windows, macOS

So Valve has a similar-ish level of complexity with well under 500 employees. Maybe Twitch needs another 100 or so employees to manage the CDN, but surely not another 1500 or more.

[–] hoshikarakitaridia@sh.itjust.works 4 points 10 months ago (1 children)

but surely a lot of that is automated, no?

You know, people have tried to automate it entirely, and they've tried to make it 50/50, but it turns out there's not much to automate. Sure, you can automate copyright claims against known media sources, but that's about it. As soon as there's any complexity to it, human review is necessary. You also have to appreciate that content moderation mistakes can ripple into platform integrity, company image, and user experience. The risks are easy to underestimate.
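
The copyright case automates well precisely because it's a database lookup, not a judgement call. Here's a minimal sketch of the idea, assuming a hypothetical 64-bit perceptual-hash scheme; the fingerprints and threshold are made up:

```python
from typing import Optional

def hamming(a: int, b: int) -> int:
    """Count differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical reference fingerprints for claimed media (made up).
REFERENCE_DB = {
    0xDEADBEEFCAFEF00D: "label-track-001",
    0x0123456789ABCDEF: "label-track-002",
}

MATCH_THRESHOLD = 6  # bits of tolerance for re-encoding noise

def find_claim(fingerprint: int) -> Optional[str]:
    """Pure lookup: a match means an automatic claim; no match means nothing to automate."""
    for ref, track in REFERENCE_DB.items():
        if hamming(fingerprint, ref) <= MATCH_THRESHOLD:
            return track
    return None

print(find_claim(0xDEADBEEFCAFEF00F))  # label-track-001 (1 bit off)
print(find_claim(0xFFFFFFFFFFFFFFFF))  # None
```

Everything outside that lookup (context, satire, consent, newsworthiness) is where human review starts.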

[–] sugar_in_your_tea@sh.itjust.works -1 points 10 months ago

Oh sure. I'm just saying that a big chunk of it can be automated, so you're left with manual review of clips that either users or bots generate. That's a big workload, but how many people are we talking? 50? 500? I'm guessing it's closer to 50 than 500, but I don't really know.
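
For a rough sense of scale, here's a back-of-envelope version of that question. Every input below is a guess on my part, and the answer swings wildly with the flag rate and per-item review time:

```python
def reviewers_needed(flagged_per_day: int, secs_per_review: int,
                     hours_per_shift: float = 6, coverage: float = 3) -> float:
    """Headcount = daily flag volume / per-reviewer throughput, scaled for 24/7 coverage."""
    throughput = hours_per_shift * 3600 / secs_per_review  # reviews per shift
    return flagged_per_day / throughput * coverage

# Optimistic: strong automation, few escalations.
print(round(reviewers_needed(flagged_per_day=10_000, secs_per_review=30)))   # ~42
# Pessimistic: heavy flag volume, slower judgement calls.
print(round(reviewers_needed(flagged_per_day=200_000, secs_per_review=60)))  # ~1667
```

So under optimistic assumptions you land near 50, and under pessimistic ones well past 500; without inside numbers, both guesses are defensible.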

[–] PrincessEli@reddthat.com -4 points 10 months ago

So stop doing it then