this post was submitted on 18 Mar 2024
94 points (97.0% liked)

[–] Maalus@lemmy.world 5 points 8 months ago

That's not how it works, and there is a huge science behind it all.

First of all, you don't want false positives. People would ruin your game over them: the reviews would be awful, and it would breed more cheaters (angry at a game that banned you for no reason? Make it ban others for a reason, ruining people's fun in the process and costing them money).

Second, most of what you are talking about is already done server side.

Third, ban waves are a thing. You want to catch as many cheaters as possible with a single detected cheat. If you ban someone on first sight, the cheat maker refunds that one person and immediately thinks up something worse. If you ban 30k people at once, all of them flock to the cheat maker asking for refunds, which he obviously can't provide, since that money was already spent over the time the cheat was active.

Fourth, lots of cheats are subtle enough to be "invisible" to any sort of detection. Say a guy has an overlay that shows people through walls. You can't ban overlays, and the client needs to know where people are; the game just hides them visually, and the cheat draws them anyway. All you can detect is what a human would see: a guy looking at people through walls while trying to hide it, a guy with "incredible game sense" basing his tactics on info he couldn't have gotten. A moderator who knows what to look for would spot it. Even an admin who abuses power and bans everyone who's too highly skilled would catch the cheater. But try writing anything that checks for statistical "averages" and you ban actually good players who use sound cues, etc. Same thing with aimbots: they're very obvious to someone watching gameplay, but go off statistics alone and you ban everyone who's simply "having a good day".
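The ban-wave idea above boils down to recording detections silently and enforcing them in one batch, so the cheat maker can't tell which build got detected or when. A minimal sketch of that (all names and the structure here are invented for illustration, not any real anti-cheat API):

```python
from collections import defaultdict

class BanWaveQueue:
    """Silently collect flagged accounts per detected cheat signature,
    then ban everyone in a single wave. Illustrative sketch only."""

    def __init__(self):
        # cheat_signature -> set of flagged account ids
        self.flagged = defaultdict(set)

    def flag(self, account_id, cheat_signature):
        # Record the detection but take no visible action yet:
        # an instant ban would tell the cheat maker exactly
        # which version of the cheat was detected.
        self.flagged[cheat_signature].add(account_id)

    def execute_wave(self):
        # Ban everyone flagged so far in one batch and reset the queue.
        banned = sorted(a for accounts in self.flagged.values() for a in accounts)
        self.flagged.clear()
        return banned

queue = BanWaveQueue()
queue.flag("player_123", "overlay_cheat_v4")
queue.flag("player_456", "overlay_cheat_v4")
queue.flag("player_789", "aimbot_v2")
print(queue.execute_wave())  # all three banned at once
```

The point of batching is economic as much as technical: one wave produces thousands of simultaneous refund requests to the cheat maker, instead of a slow trickle he can absorb.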

The way to do it was how Valve handled it in CS:GO with the Overwatch system (no idea if it's still in the game). They basically tasked the community with being judge and executioner. They would send you a replay in the client showing you about 10 minutes of a match. Sometimes they would send a replay they already knew contained a blatant cheater, to test whether you actually voted "ban" when you saw one. They scored the judges, weighting the better ones more heavily and providing feedback like "your verdict has led to a ban". It was a slow process, but effective, or at least it would have been if the game weren't so incredibly popular and free to play. Obviously a live moderator would help more, but it's the next best thing.
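The judge-scoring mechanism described above can be sketched as two pieces: calibrating each reviewer's trust on seeded cases with a known answer, and weighting verdicts by that trust. Valve has not published its actual scoring, so the weights and function names below are hypothetical:

```python
def update_trust(trust, verdict, is_seeded_cheater_case):
    """Adjust a reviewer's trust after a calibration case. Seeded
    replays have a known cheater, so a correct 'ban' verdict raises
    trust and a wrong one lowers it. (Hypothetical weights; the real
    Overwatch scoring is not public.)"""
    if is_seeded_cheater_case:
        if verdict == "ban":
            return min(1.0, trust + 0.05)
        return max(0.0, trust - 0.10)
    return trust

def case_score(verdicts):
    """Weight each reviewer's verdict by their trust; a case is only
    'convicted' when trusted reviewers overwhelmingly agree.
    `verdicts` is a list of (trust, verdict) pairs."""
    total = sum(t for t, _ in verdicts)
    ban_weight = sum(t for t, v in verdicts if v == "ban")
    return ban_weight / total if total else 0.0

# Three reviewers with different track records judge the same replay:
verdicts = [(0.9, "ban"), (0.8, "ban"), (0.2, "insufficient evidence")]
print(round(case_score(verdicts), 2))  # 0.89 -> strong consensus to ban
```

Seeding known-cheater replays is what keeps the crowd honest: reviewers who rubber-stamp everything (or nothing) lose trust, so their future verdicts count for less.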