this post was submitted on 19 Jan 2025
223 points (98.7% liked)
Games
"So poorly optimised you need future technology to run it" isn't the future-proofing strategy I'd go with, but ok...
So, I've seen this phenomenon discussed before, though I don't think it was from the Crysis guys. They've got a legit point, and I don't think that this article does a very clear job of describing the problem.
Basically, the problem is this: as a developer, you want your game to be able to take advantage of computing advances over the next N years beyond just running faster. Okay, that's legit, right? You want people to be able to jack up the draw distance, use higher-res textures further out, whatever. You're trying to make life good for the players. You know what the game can do on current hardware, but you don't want to restrict players to just that, so you let the sliders enable draw distances or shadow resolutions that current hardware can't reasonably handle.
The problem is that the UI doesn't typically indicate this in very helpful ways. What happens is that a lot of players who have just gotten themselves a fancy gaming machine, immediately upon getting a game, go to the settings and turn everything up to maximum so that they can take advantage of their new hardware. If the game doesn't run smoothly at those settings, they complain that the game is badly written. "I got a top-of-the-line GeForce RTX 4090, and it still can't run Game X at a reasonable framerate. Don't the developers know how to do game development?"
To some extent, developers have tried to deal with this by using labels that sound unreasonable, like "Extreme" or "Insane" instead of "High", to hint to players that they shouldn't expect to run at those settings on current hardware. I'm not sure that it has worked.
I think that this is really a UI problem. That is, the idea should be to clearly communicate to the user that some settings are really intended for future computers. Maybe "Future computers", or "Try this in the year 2028", or something. I suppose that games could just hide some settings and push an update down the line that unlocks them, though I think that's a little obnoxious and I'd rather not have it happen on games that I buy -- and if the game company goes under, those settings might never get unlocked.

Maybe if games consistently had some kind of really reliable auto-profiling mechanism that could run various "stress test" scenes at a variety of settings to find reasonable settings for the hardware at hand, players wouldn't head straight for all-maximum settings. That requires pretty much all games to do a good job of implementing it, though, or I expect players won't trust the feature to take advantage of their hardware. And once mods enter the picture, it's hard for developers to create a reliable stress-test scene to render, since they don't know what the mods will do.
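Roughly what I mean by auto-profiling, as a toy sketch -- it assumes the engine has some fixed worst-case scene it can render and time, and every name and number below is made up, not how any particular game does it:

```cpp
// Toy auto-profiling pass: step through quality presets, "render" a fixed
// stress-test scene at each one, and keep the highest preset that still hits
// a target framerate. Everything here is invented for illustration.
#include <iostream>
#include <string>
#include <vector>

struct Preset {
    std::string name;
    int quality;  // 0 = Low ... 4 = settings aimed at future hardware
};

// Stand-in for rendering the stress-test scene for a few seconds and
// averaging the frame times; a real engine would actually measure this.
double measureAverageFps(const Preset& preset) {
    return 240.0 / (preset.quality + 1);  // fake numbers
}

Preset pickPreset(const std::vector<Preset>& presets, double targetFps) {
    Preset best = presets.front();
    for (const Preset& p : presets) {
        if (measureAverageFps(p) >= targetFps && p.quality > best.quality) {
            best = p;
        }
    }
    return best;
}

int main() {
    std::vector<Preset> presets = {{"Low", 0},     {"Medium", 1}, {"High", 2},
                                   {"Extreme", 3}, {"Future hardware", 4}};
    // On this fake "hardware" the benchmark lands on Extreme; the top preset
    // is left alone until a measured framerate can actually sustain it.
    std::cout << "Suggested preset: " << pickPreset(presets, 60.0).name << "\n";
}
```

The point is just that the defaults come from a measurement rather than from the player cranking every slider, and the "future hardware" preset stays available without being the thing everyone hits first.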
Console games tend to solve the problem by just taking the controls out of the player's hands. The developers decide where the quality settings sit, since players -- mostly -- all have the same hardware, and you don't get to touch them. The issue really shows up on the PC, where the question is "should the player be permitted to push the levers past what current hardware can reasonably do?"
...like most games from the early 2010s? A lot of them had a built-in benchmark that tested what your PC was capable of and then set things up for you.
Yeah, there are auto-calibration systems, but that's why I'm emphasizing "reliable". I've had some of them, for whatever reason, refuse to ramp up quality settings on hardware from a decade later even though it could run them smoothly, which is irritating. In fairness to the developers, they can't test on future hardware, but I also don't understand why that happens. Maybe there are some hard-coded assumptions that fall down for some reason down the line.
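My guess at the kind of hard-coded assumption that bites, purely as a made-up illustration: if the calibration leans on a table of hardware known at ship time, anything released afterwards isn't recognized and falls through to a conservative default.

```cpp
// Toy example of a hard-coded assumption falling down: the suggested preset
// comes from a lookup table of GPUs known when the game shipped, so a card
// released years later isn't in it and gets the most conservative default.
// The names and mapping are invented for illustration.
#include <iostream>
#include <map>
#include <string>

std::string suggestPreset(const std::string& gpuName) {
    static const std::map<std::string, std::string> knownGpus = {
        {"GeForce GTX 980", "High"},
        {"GeForce GTX 1080", "Extreme"},
    };
    auto it = knownGpus.find(gpuName);
    return it != knownGpus.end() ? it->second : "Low";  // unknown -> lowest preset
}

int main() {
    // A decade-newer card falls through the table and gets "Low".
    std::cout << suggestPreset("GeForce RTX 4090") << "\n";
}
```

Measuring actual performance, like the benchmark approach above, would age better than recognizing hardware by name, which is why it's so annoying when the measurement-based systems get it wrong too.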