This post was submitted on 10 Jan 2024

Just a day after Unity announced it would be laying off 1,800 employees as part of an ongoing "company reset", it's bei…

vexikron@lemmy.zip · 11 points · 10 months ago (last edited 10 months ago)

Content moderation at this kind of scale is:

  1. Impossible to do without both humans and algorithms (a toy sketch of this loop follows below)

  2. Always going to produce absurd, hypocritical and controversial interpretations of, and changes to, the TOS

  3. A horrific workload for those programming the algorithms, who must devise ways of screening for poorly defined and constantly changing TOS violations

  4. Literally traumatizing for the human moderators, causing them massive mental damage

Facebook has these problems on an even worse scale, and it still operates what are basically computer-equipped sweatshops of thousands of people in less economically developed parts of the world, most of whom report massive mental trauma from having to constantly review absolutely horrific content, day in, day out, for years.
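A minimal sketch of the "humans plus algorithms" loop from point 1 above (the class names, thresholds, and toy scorer here are illustrative assumptions, not any platform's actual pipeline): an automated score disposes of the clear-cut cases at either end, and everything ambiguous lands in the human review queue that points 3 and 4 are about.

```python
# Hypothetical sketch of a hybrid human/algorithm moderation loop.
# The classifier, thresholds, and routing labels are illustrative
# assumptions, not any platform's real moderation pipeline.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Post:
    id: int
    text: str


def route(post: Post, score: Callable[[str], float],
          remove_threshold: float = 0.95,
          review_threshold: float = 0.6) -> str:
    """Route a post based on an automated risk score.

    High-confidence cases are handled automatically; everything in the
    ambiguous middle band goes to the human review queue, which is where
    the workload (and the trauma) concentrates.
    """
    s = score(post.text)
    if s >= remove_threshold:
        return "auto_remove"
    if s >= review_threshold:
        return "human_review"
    return "allow"


def toy_score(text: str) -> float:
    """Stand-in for a constantly retrained model chasing a moving TOS."""
    banned = {"slur", "gore"}
    hits = sum(word in text.lower() for word in banned)
    return min(1.0, 0.7 * hits)


if __name__ == "__main__":
    posts = [Post(1, "new game trailer"),
             Post(2, "gore compilation"),
             Post(3, "slur-filled gore thread")]
    for p in posts:
        print(p.id, route(p, toy_score))
```

Widening or narrowing the review band is exactly the trade-off in the list: a wider band means more traumatic human workload, a narrower one means more absurd automated calls.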

Kalkaline@leminal.space · 2 points · 10 months ago

100% agree, but the alternative is giving the average user moderation power and hoping they do a good job of it, or not moderating at all.

vexikron@lemmy.zip · 2 points · 10 months ago (last edited 10 months ago)

That's /an/ alternative.

Another alternative is that /social networks this large should not exist/.

There are many, many other alternatives.

It's just that social networks this large have basically destroyed the brains of the people who use them, so now they can hardly imagine alternatives.

And that is /another/ argument for why they shouldn't exist: they normalize themselves the way social and cultural institutions do, but with none of the accountability that local and state governments at least theoretically have.

This also explains why such things are not likely to go away. Beyond being addictive at an individual level, the network effect creates peer pressure to engage more and otherizes those who do not, making them social outcasts, at least within the relevant age ranges for a given platform. It has also already become pervasive in matters of direct economic importance, with many companies refusing to hire, and many landlords refusing to rent, if they cannot first verify your social media presence on these large platforms.

To slightly inaccurately quote Morpheus from Deus Ex:

The human being desires judgement; without this, group cohesion is impossible, and thus civilization.

At first you (humans) worshipped gods, then the fame and fortune of others. Next, it will be autonomous systems of surveillance, pervasive everywhere.

Welp, turns out that real-life, mass-scale social media networks are literally a hybrid or synthesis of the latter two mechanisms of social reverence/judgement.