this post was submitted on 16 Dec 2023
36 points (62.5% liked)
Games
Video game news oriented community. No, NanoUFO is not a bot :)
Posts.
- News oriented content (general reviews, previews or retrospectives allowed).
- Broad discussion posts (preferably not only about a specific game).
- No humor/memes, etc.
- No affiliate links
- No advertising.
- No clickbait, editorialized, sensational titles. State the game in question in the title. No all caps.
- No self promotion.
- No duplicate posts; the newer post will be deleted unless it has more discussion.
- No politics.
Comments.
- No personal attacks.
- Obey instance rules.
- No low-effort comments (one or two words, emoji, etc.)
- Please use spoiler tags for spoilers.
My goal is just to have a community where people can go and see what new game news is out for the day and comment on it.
you are viewing a single comment's thread
That's not the point at all, though. The point is that it hides good content that a motivated group wants to silence. We had precisely this problem earlier in Lemmy's history, where posts critical of China were heavily downvoted, not because of quality, but because the group didn't like the message.
Requiring a comment gives context to the negative reaction. It's not a silver bullet, but it should increase the barrier to hiding content, hopefully enough that good, controversial content stays visible.
I'm actually working on a Lemmy alternative that uses a web of trust instead of votes to prioritize and moderate content. Reddit has shown the limitations of voting, and I'm more interested in interesting content than content the majority likes.
Comments do the same thing by drowning the one opinion in a sea of alternate opinions, and they incentivize interaction only from people with the time to type up a comment. You aren't preventing brigades; you're just narrowing them to the users willing to put in that effort.
Especially since your version of brigading is literally how communities work. If the group doesn't agree with an opinion, even an opinion you do agree with, that opinion is going to be drowned out. You cannot police "opinion quality," because such a subjective thing is good when it agrees with you and bad when it doesn't.
Good luck, and I look forward to seeing it, but to be frank it sounds like you want to build a personal group chat, not a social media site. And like any web of trust, it relies on the integrity of the central member, which isn't a defense against brigading, just a defense against brigading that doesn't come from the central member or their points of trust.
E: mind, not that there's anything wrong with crafting your own supported super chat. Just that it's less social media and more a hyper-evolved chat among friends and friends-of-friends.
Maybe at a very high level, but comments have the very obvious advantage that they provide something that moderators can block. Lemmy does have open voting logs, but I highly doubt any decent moderator would feel comfortable blocking people based purely on how they vote, and they'd only actually look if there was an obvious problem (e.g. maybe they need to consider blocking an entire instance).
This only applies to negative interactions; you would always be able to upvote a post.
I think there's an argument for hiding the voting buttons inside of the comment thread so users can't just drive-by vote without actually looking at the comments, much less the linked content, but that's not what I'm arguing for.
You're absolutely right, but you can increase the effort needed to downvote something. A downvote tends to have more weight than an upvote, so it should require more effort as well (e.g. a post with 8 upvotes and 0 downvotes would probably be ranked higher than one with 20 upvotes and 12 downvotes).
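To make the asymmetry concrete, here's a minimal sketch of that kind of ranking. The 1.5 weight is purely an assumption for illustration; it isn't anything Lemmy (or any real site) actually uses, just a way to express "a downvote counts for more than an upvote":

```python
# Hypothetical ranking sketch: downvotes carry more weight than
# upvotes. Any weight > 1 expresses the asymmetry; 1.5 is assumed.
DOWNVOTE_WEIGHT = 1.5

def score(upvotes: int, downvotes: int) -> float:
    """Net score where each downvote subtracts more than one upvote adds."""
    return upvotes - DOWNVOTE_WEIGHT * downvotes

print(score(8, 0))    # 8.0
print(score(20, 12))  # 2.0 -> the 8/0 post ranks higher
```

Under this weighting, the 8-up/0-down post scores 8.0 while the 20-up/12-down post scores only 2.0, matching the intuition above even though the second post has a higher raw net vote count.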
No, I definitely want a social media site, I just want everything distributed, including moderation.
Basically, I want something like BitTorrent, but for social media instead of files. That way there's no central authority for pretty much anything, so moderation pretty much has to be opt-in (otherwise you'd pick a different client with different moderation). Ideally, you'd select a moderation team that would filter out bad stuff like CSAM, but not filter out high quality content that you simply disagree with. So you'd pick a diverse set of content moderators to trust, and content would only get filtered out if a certain number of them flagged it. You could use the tools to create an echo chamber for yourself, or you can use it to expose yourself to diverse, high quality content that may challenge your beliefs (my personal preference).
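The flag-quorum idea could be sketched roughly like this. All of the names and the quorum value are hypothetical, just to show the shape of opt-in moderation where content is hidden only when enough of *your* chosen moderators flag it:

```python
# Hypothetical sketch of opt-in, quorum-based moderation: content is
# hidden only when at least `quorum` of the moderators the user chose
# to trust have flagged it. Mod names and quorum value are made up.

def is_hidden(flaggers: set[str], trusted_mods: set[str], quorum: int) -> bool:
    """Hide content only if enough trusted moderators flagged it."""
    return len(flaggers & trusted_mods) >= quorum

trusted = {"mod_a", "mod_b", "mod_c", "mod_d"}

# Two of the user's trusted mods flagged it; with quorum=2 it's hidden.
print(is_hidden({"mod_a", "mod_c", "rando"}, trusted, quorum=2))  # True

# A single flag (or flags from untrusted accounts) isn't enough.
print(is_hidden({"mod_a"}, trusted, quorum=2))  # False
```

Since each user picks their own `trusted` set and `quorum`, the same piece of content can be visible to one user and hidden for another, which is exactly the opt-in property described above.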
That said, things tend to work differently in practice. At the very least, I'm not going to release it until I have a way for users to review the quality of the moderators they pick.