Who defines toxic behaviour? Is that definition clearly stated to players in a way that isn't hidden? Is every reported case of toxic behaviour carefully reviewed by a human?
This is an interesting idea but I can totally see this being maliciously abused.
Late reply, but just so you know...
Before you first launch the game, you must agree to the Riot Games terms of service. The terms state very clearly what counts as toxic behaviour and are pretty easy to read through. After the tutorial, and before you queue for the first time, you must also agree to an in-game code of conduct, which is basically a summary of what good in-game conduct looks like (paraphrasing).
Although it's not confirmed, players seem to be punished based on the volume of in-game reports and some sort of review. When you report a player, there are categories you can choose that describe their conduct. There's also a text box where you can type out what you feel they did.
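Nothing official has been published about how that review actually gets triggered, so take this as a rough, invented sketch of how report volume could feed into it (the category names and the threshold are made up, not Riot's):

```python
from collections import defaultdict

# All names and numbers here are illustrative; Riot hasn't published
# how report volume actually feeds into review.
REPORT_CATEGORIES = {"abusive comms", "griefing", "cheating", "afk"}
REVIEW_THRESHOLD = 5  # assumed number of reports before a review is queued

report_counts: defaultdict[str, int] = defaultdict(int)

def file_report(reported_id: str, category: str, details: str = "") -> bool:
    """Record a report; return True once the player should be queued for review."""
    if category not in REPORT_CATEGORIES:
        raise ValueError(f"unknown report category: {category}")
    report_counts[reported_id] += 1
    return report_counts[reported_id] >= REVIEW_THRESHOLD
```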
For text chat violations, punishment sometimes happens automatically, even without reports. For example, if you use a racist term, you're immediately muted in text chat for a while.
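I don't know how their filter is actually built, but conceptually it's just an automatic keyword check plus a timed mute, something like this (the word list and mute duration are invented for illustration):

```python
import time

# Placeholder terms and duration; the real word list and mute length aren't public.
BLOCKED_TERMS = {"blockedword1", "blockedword2"}
MUTE_SECONDS = 15 * 60

muted_until: dict[str, float] = {}  # player_id -> unix timestamp the mute expires

def handle_chat_message(player_id: str, message: str) -> bool:
    """Return True if the message should be delivered, False if it's blocked."""
    now = time.time()
    if muted_until.get(player_id, 0) > now:
        return False  # player is currently muted
    if any(term in message.lower() for term in BLOCKED_TERMS):
        muted_until[player_id] = now + MUTE_SECONDS  # instant mute, no report needed
        return False
    return True
```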
Although it hasn't been confirmed, Riot has been trialling a system where they actually record and transcribe in-game voice chat. The rumour is that an in-game report triggers an automated and/or manual review of the transcript. For most reports, you get a confirmation within a few hours that the player was punished, along with thanks for feedback that helps the community.
Punishments range from a competitive queue cooldown (these get progressively longer the more you repeat the behaviour, and reset after a stretch of good behaviour) to hardware ID bans for the worst cases. A hardware ID ban prevents the player from playing on any account on a PC with the same hardware fingerprint for at least five months, and, in some cases, permanently closes accounts suspected to be theirs.
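To give a feel for the escalation ladder, here's a toy version (the tier lengths and reset window are guesses, not Riot's actual numbers):

```python
from __future__ import annotations

from dataclasses import dataclass
from datetime import datetime, timedelta

# Invented values; Riot doesn't publish its actual tiers or reset window.
COOLDOWN_TIERS = [timedelta(hours=1), timedelta(days=1), timedelta(days=7)]
RESET_AFTER = timedelta(days=90)  # assumed "stretch of good behaviour"

@dataclass
class PenaltyRecord:
    tier: int = 0
    last_offence: datetime | None = None

def apply_penalty(record: PenaltyRecord, now: datetime) -> timedelta:
    """Return the queue cooldown for a newly confirmed offence."""
    if record.last_offence and now - record.last_offence > RESET_AFTER:
        record.tier = 0  # a clean stretch resets the escalation ladder
    cooldown = COOLDOWN_TIERS[min(record.tier, len(COOLDOWN_TIERS) - 1)]
    record.tier += 1
    record.last_offence = now
    return cooldown
```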
If someone has bought a bunch of in-game cosmetics, a ban like that will very likely push them to move on to another game. But, of course, the worst offenders will find a way around it.
And by the way, the terms also make it clear that when you buy in-game cosmetics, you're actually buying a non-transferable, revocable license to use them in-game. That license can be revoked at any time, for example if you violate the terms of service.
And also, Riot's support site gives players a way to dispute bans, just in case a player was banned by mistake.
It's not perfect (and the game itself is far from perfect in plenty of other ways), but they at least make it clear what toxic behaviour is, and they've put some thought into a system for handling it. I think the video/article is more about stepping up manual review and the scale of punishments for the worst offenders.