[–] tal@lemmy.today 4 points 1 week ago (5 children)

The vote was overwhelmingly in favor of the ban, dude, on both sides of the aisle, and every other House member in Wisconsin voted in favor of it. Your one legislator is not the deciding factor on this.

[–] tal@lemmy.today 2 points 1 week ago* (last edited 1 week ago)

Yeah, I agree that the "this particular setting is performance-intensive" thing is helpful. But one issue that developers hit is that when future hardware enters the picture, it's really hard to know exactly what the impact is going to be, because you also have to kind of predict where hardware development is going to go, and it's easy to get that pretty wrong.

Like, one thing that's common to do with performance-critical software like games is to profile cache use, right? Like, you try and figure out where the game is generating cache misses, and then work with chunks of data that keep the working set small enough that you can stay in cache where possible.
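To make the "keep the working set in cache" idea concrete, here's a toy sketch (not from any real engine; the chunk size is a made-up number you'd actually tune against the target CPU and then verify with a profiler's cache-miss counters):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Process a big array in cache-sized chunks ("blocking"/"tiling") so that the
// repeated passes over each chunk mostly hit cache instead of main memory.
void update_all(std::vector<float>& data) {
    constexpr std::size_t kChunkElems = 64 * 1024;  // hypothetical chunk size; tune per target CPU
    for (std::size_t start = 0; start < data.size(); start += kChunkElems) {
        const std::size_t end = std::min(start + kChunkElems, data.size());
        // Multiple passes over the same chunk: after the first pass pulls the
        // chunk in, the later passes should mostly stay resident in cache.
        for (int pass = 0; pass < 3; ++pass) {
            for (std::size_t i = start; i < end; ++i) {
                data[i] = data[i] * 0.5f + 1.0f;  // stand-in for real per-element work
            }
        }
    }
}
```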

I've got one of those X3D Ryzen processors where they jacked on-die cache way, way up, to 128MB. I think I remember reading that AMD decided that on the net, the clock tradeoff entailed by that wasn't worth it, and was intending to cut the cache size on the next generation. So a particular task that blows out the cache above a certain data set size -- when you move that slider up -- might have horrendous performance impact on one processor and little impact on another with a huge cache...and I'm not sure that a developer would have been able to reasonably predict that cache sizes would rise so much and then maybe drop.

I remember -- this is a long time ago now -- when one thing that video card vendors did was to disable antialiased line rendering acceleration on "gaming" cards. Most people using 3D cards to do 3D modeling really wanted antialiased lines, because they spent a lot of time looking at wireframes and wanted them to look nice. They were using the hardware for real work and were less price-sensitive. Video card vendors decided to try to differentiate the product so that they could use price discrimination. Okay, so imagine that you're a game developer and you say that antialiased lines -- which I think most developers would just assume would become faster and faster -- don't have a large performance impact...and then the hardware vendors start disabling the feature on gaming cards, so suddenly newer cards may be slower at rendering them than earlier cards were. Now your guidance is wrong.

Another example: Right now, there are a lot of people who are a lot less price-sensitive than most gamers wanting to use cards for parallel compute to run neural nets for AI. What those people care a lot about is having a lot of on-card memory, because that increases the model size that they can run, which can hugely improve the model's capabilities. I would guess that we may see video card vendors try to repeat the same sort of product differentiation, assuming that they can manage to collude to do so, so that they can charge the people who want to run those neural nets more money. They might tamp down on how much VRAM they stick on new GPUs aimed at gaming, so that it's not possible to use cheap hardware to compete with their expensive compute cards. If you're a developer and you figure that using, say, 2x to 3x the VRAM current hardware has will be reasonable for your game N years down the line, that...might not be a realistic assumption.

I don't think that antialiasing mechanisms are transparent to developers -- I've never written code that uses hardware antialiasing myself, so I could be wrong -- but let's imagine that they are for the sake of discussion. Early antialiasing worked via what's today called FSAA. That's simple and, for most things -- aside from pinpoint bright spots -- very good quality, but it gets expensive quickly. Let's say that there was just some API call in OpenGL that let you get a list of available antialiasing options ("2xFSAA", "4xFSAA", etc). Exposing that to the user and saying "this is expensive" would have been very reasonable for a developer -- FSAA was very expensive if you were bottlenecked on nearly any part of graphics rendering, since it did quadratically-increasing amounts of what the GPU was already doing. But then subsequent antialiasing mechanisms were a lot cheaper. In 2000, I didn't think of future antialiasing algorithm improvements -- I just thought of antialiasing as entailing rendering something at high resolution, then scaling it down, i.e. doing FSAA. I'd guess that many developers wouldn't have either.
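For what it's worth, modern OpenGL does expose roughly that kind of query. Just a sketch, assuming a current GL 4.2+ context is already set up and a loader like GLAD has been initialized:

```cpp
#include <cstdio>
#include <vector>
#include <glad/gl.h>  // assumption: GLAD2 loader; a GL context must already be current

// Ask the driver which MSAA sample counts it supports for an RGBA8 render target.
std::vector<GLint> supportedSampleCounts() {
    GLint numCounts = 0;
    glGetInternalformativ(GL_RENDERBUFFER, GL_RGBA8, GL_NUM_SAMPLE_COUNTS, 1, &numCounts);

    std::vector<GLint> samples(numCounts);
    glGetInternalformativ(GL_RENDERBUFFER, GL_RGBA8, GL_SAMPLES,
                          numCounts, samples.data());  // returned highest count first

    GLint maxSamples = 0;
    glGetIntegerv(GL_MAX_SAMPLES, &maxSamples);
    std::printf("driver cap: %d samples\n", maxSamples);
    return samples;  // e.g. {16, 8, 4, 2} -> the "2x/4x/8x MSAA" dropdown entries
}
```

That only covers MSAA-style hardware antialiasing, though -- post-process stuff like FXAA/TAA is its own shader work and wouldn't show up in a query like that, which is kind of the point about not being able to predict where the techniques go.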

[–] tal@lemmy.today 4 points 1 week ago* (last edited 1 week ago)

The base game's campaign was meh, kinda repetitive. The expansions -- and player-made adventures -- improved on it a lot.

[–] tal@lemmy.today 1 points 1 week ago

Yeah, there are auto-calibration systems, but that's why I'm emphasizing "reliably". I've had some of them, for whatever reason, not ramp up quality settings on hardware from a decade later, even though it can run the game smoothly, which is irritating. In fairness to the developers, they can't test on future hardware, but I also don't understand why that happens. Maybe there are some hard-coded assumptions that fall down for some reason down the line.

[–] tal@lemmy.today 1 points 1 week ago

doesn't account for the trade-off between looks and framerate that a player wants,

Yeah, I thought about talking about that in my comment too. Like, maybe a good route would be to have something like a target minimum FPS slider or something. That -- theoretically, if implemented well -- could provide a way to do reasonable settings on a limited, per-player basis without a lot of time investment by the player and without smacking into the "player expects maximum settings to work" issue.
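Something like this, very roughly -- just a toy sketch, with a made-up single 0-10 "quality" knob and made-up thresholds:

```cpp
// Toy "target minimum FPS" controller: step a single quality knob down when the
// measured frame rate dips below target, and back up when there's clear headroom.
// A real implementation would adjust specific settings and smooth over many frames.
struct QualityController {
    int quality = 10;            // 0 = lowest, 10 = highest (hypothetical scale)
    double targetMinFps = 60.0;  // the slider the player actually sets

    void onFrame(double frameSeconds) {
        const double fps = 1.0 / frameSeconds;
        if (fps < targetMinFps && quality > 0) {
            --quality;                                   // below target: back off
        } else if (fps > targetMinFps * 1.25 && quality < 10) {
            ++quality;                                   // headroom: try more eye candy
        }
    }
};
```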

There are also a few people who want the ability to ram quality way up and do not care at all about frame rate for certain things like screenshots, which complicates matters.

I think that one of the big problems is that if any games out there do a "bad" job of choosing settings, which I have seen many games do, it kills player trust in the "auto-calibration" feature. So the developers of Game A are impacted by what the developers of Game B do. And there's no real way that they can solve that problem.

[–] tal@lemmy.today 2 points 1 week ago* (last edited 1 week ago) (1 children)

For some reason, Warno didn't grab me and Steel Division 2 did. That being said, I may not have given it a fair chance -- I bailed out on it after a short period of time, probably because SD2 was also available at about the same time. It is true that it's one of the few options out there with a late Cold War setting, like Wargame, so if you like that setting over WW2 -- which is refreshing -- it's certainly worth looking into.

IIRC, one thing that was a little disappointing was that the unit database was a lot smaller than in Wargame: Red Dragon -- I'd kind of taken that, which had been built up across multiple Wargame games, for granted.

[–] tal@lemmy.today 49 points 1 week ago* (last edited 1 week ago) (11 children)

So, I've seen this phenomenon discussed before, though I don't think it was from the Crysis guys. They've got a legit point, and I don't think that this article does a very clear job of describing the problem.

Basically, the problem is this: as a developer, you want to make your game able to take advantage of computing advances over the next N years other than just running faster. Okay, that's legit, right? You want people to be able to jack up the draw distance, use higher-res textures further out, whatever. You're trying to make life good for the players. You know what the game can do on current hardware, but you don't want to restrict players to just that, so you let the sliders enable those draw distances or shadow resolutions that current hardware can't reasonably handle.

The problem is that the UI doesn't typically indicate this in very helpful ways. What happens is that a lot of players who have just gotten themselves a fancy gaming machine, immediately upon getting a game, go to the settings and turn them all up to maximum so that they can take advantage of their new hardware. If the game doesn't run smoothly at those settings, then they complain that the game is badly written. "I got a top-of-the-line GeForce RTX 4090, and it still can't run Game X at a reasonable framerate. Don't the developers know how to do game development?"

To some extent, developers have tried to deal with this by using terms that sound unreasonable, like "Extreme" or "Insane" instead of "High" to help to hint to players that they shouldn't be expecting to just go run at those settings on current hardware. I am not sure that they have succeeded.

I think that this is really a UI problem. That is, the idea should be to clearly communicate to the user that some settings are really intended for future computers. Maybe "Future computers", or "Try this in the year 2028" or something. I suppose that games could just hide some settings and push an update down the line that unlocks them, though I think that that's a little obnoxious and I'd rather not have it happen on games that I buy -- and if a game company goes under, those settings might never get unlocked. Maybe if games consistently had some kind of really reliable auto-profiling mechanism that could go run various "stress test" scenes with a variety of settings to find reasonable settings for given hardware, players wouldn't head straight for all-maximum settings. That requires that pretty much all games do a good job of implementing it, though, or I expect that players won't trust the feature to take advantage of their hardware. And if mods enter the picture, then it's hard for developers to create a reliable stress-test scene to render, since they don't know what mods will do.
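The auto-profiling idea would boil down to something like this -- purely a sketch with hypothetical names; the interesting (and hard) part is making the canned benchmark scene actually representative:

```cpp
#include <functional>
#include <string>
#include <vector>

struct Preset { std::string name; /* ...the actual graphics settings... */ };

// presetsHighToLow: presets ordered from best-looking to cheapest (assumed non-empty).
// benchmark: renders the canned stress-test scene at a preset and returns average FPS.
Preset pickPreset(const std::vector<Preset>& presetsHighToLow,
                  const std::function<double(const Preset&)>& benchmark,
                  double targetFps) {
    for (const Preset& p : presetsHighToLow) {
        if (benchmark(p) >= targetFps) {
            return p;  // keep the highest preset that still holds the target frame rate
        }
    }
    return presetsHighToLow.back();  // nothing holds it; fall back to the lowest
}
```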

Console games tend to solve the problem by just taking the controls out of the player's hands. The developers decide where the quality controls are, since players have -- mostly -- one set of hardware, and then you don't get to touch them. The issue is really on the PC, where the question is "should the player be permitted to push the levers past what current hardware can reasonably do?"

[–] tal@lemmy.today 6 points 1 week ago* (last edited 1 week ago)

I don't have a problem with a model where I pay more money and get more content. And I do think that there are certain things that can only really be done with live service that some people will really enjoy -- I don't think that live service shouldn't exist. But I generally prefer the DLC model to the live service model.

  • Live service games probably won't be playable after some point. That sucks if you get invested in them...and live service games do aim at people who are really invested in playing them.

  • I have increasingly shifted away from multiplayer games over the years. Yeah, there are neat things you can do with multiplayer games. Humans make for a sophisticated alternative to AI. But they bring a lot of baggage. Humans mean griefing. Humans mean needing to have their own incentives taken care of -- like, they want to win a certain percentage of the time, aren't just there to amuse other humans. Most real-time multiplayer games aren't pausable, which especially is a pain for people with kids, who may need to deal with random-kid-induced-emergencies at unexpected times. Humans optimize to win in competitive games, and what they do to win might not be fun for other players. Humans may not want to stay in character ("xXxPussySlayer69xXx"), which isn't fantastic for immersion -- and even in roleplay-enforced environments, that places load on other players. Multiplayer games generally require always-online Internet connectivity, and service disruption -- even an increase in latency, for real-time games -- can be really irritating. Humans cheat, and in a multiplayer game, cheating can impact the experience of other players, so that either means dealing with cheating or with anti-cheat stuff that creates its own host of irritations (especially on Linux, as it's often low-level and one of the major remaining sources of compatibility issues).

  • If there are server problems, you can't play.

  • My one foray where I was willing to play a live service game was Fallout 76; Fallout 5 wasn't coming out any time soon, and it was the closest thing that was going to be an option. One major drawback for me was that the requirements of grindable (i.e. inexpensive to develop relative to the amount of playtime) multiplayer gameplay were also immersion-breaking -- instead of running around in a world where I can lose myself, I'm being notified that some random player has initiated an event, which kind of breaks the suspension of disbelief. It also places constraints on the plot. In prior entries in the Fallout series, you could significantly change the world, and doing so was a signature of the series. In Fallout 76, you've got a shared world, so that's pretty hard to do, other than in some limited, instanced ways. Not an issue for every type of game out there, but it was annoying for that game. Elite: Dangerous has an offline mode that pretends to be faux-online -- again, the game design constraints from being multiplayer kind of limit my immersion.

They do provide a way to do DRM -- if part of the game that you need to play lives on the publisher's servers, then absent reimplementing it, pirates can't play it. And I get that that's appealing for a publisher. But it just comes with a mess of disadvantages.

[–] tal@lemmy.today 6 points 1 week ago* (last edited 1 week ago) (5 children)

Hmm. "Strategy" is pretty broad. Most of the new stuff you have is turn-based, but you've got tactics stuff like X-COM and strategy stuff. If we're including both real-time and turn-based, and both strategy and tactics...What do I enjoy? I tend to lean more towards the milsim side of strategy...

  • Stellaris. Lots of stuff to do here -- it follows the Paradox model of a ton of DLCs with content and lots of iteration on the game. Not cheap, though. Real-time (pausable), 4X.

  • Hearts of Iron 4. Another Paradox game. I think unless someone is specifically into World War II grand strategy, I'd recommend Stellaris first, which I'd call a lot more approachable. Real time, grand strategy. I haven't found myself playing this recently -- the sheer scope can be kind of overwhelming, and unlike 4X games like Stellaris, it doesn't "start out small" -- well, not if you're playing the US, at any rate.

  • Carrier Command 2. Feels a little unfinished, but it keeps pulling me back. Really intended to be played multiplayer, but you can play single-player if you can handle the load of playing all of the roles concurrently. Real-time tactics.

  • Rule the Waves 3. Lot of ship design here, fun if you're into gun-era naval combat. Turn-based strategy (light strategy), with real-time tactics combat. Not beautiful. There is a niche of people who are super-into this.

  • I agree with the other user who recommended Steel Division 2. If you've played Wargame: Red Dragon or earlier Eugen games, which are really designed to be played multiplayer, you know that the AI is abysmal. I generally don't like playing multiplayer games, and persisted in playing it single-player. Steel Division 2's AI is actually fun to play against single-player. Real-time tactics, leaning towards the MOBA genre but without heroes and themed with relatively-real-world military hardware.

  • XCOM-alikes. I didn't like XCOM 2 -- it felt way too glitzy for me to tolerate, too much time looking at animations, but I may have just not given it a fair chance, as I bailed out after spending only a little time with the game. I have enjoyed turn-based tactics games in the X-COM series and the genre in the past -- squad-based, turn-based tactics games. Problem is that I don't know if I can recommend any of them in 2024 -- all the games in that genre I've played are pretty long in the tooth now. Jagged Alliance 2 is fun, but very old. Silent Storm is almost as old, has destructible terrain, but feels low-budget and unpolished. There were a number of not-very-successful attempts to restart the Jagged Alliance series after 2 and a long gap; I understand that Jagged Alliance 3 is supposed to be better, but I don't think I've played through it yet. Wasteland 2 and Wasteland 3 aren't really in the same genre -- they're more like Fallout 1 and Fallout 2, CRPGs with turn-based tactics combat. But if you enjoy turn-based tactics, you might also enjoy them, and Wasteland 3 isn't that old.

  • If you like real-time tactics, you might give the Close Combat series a look. I really liked the (now ancient) Close Combat 2. The balance for that game was terrible -- it heavily rewarded keeping heavy tanks on hills -- but it was an extremely popular game, and I loved playing it. There are (many) newer games in the series, but they started including a strategic layer and a round timer after Close Combat 3. These improved things in the game (and if you like a strategy aspect, you might prefer that), but I just wanted to play the tactics side, and I don't feel like the later games ever quite had the appeal of the earlier ones. Still, they've certainly had enough to make me come back and replay them.

[–] tal@lemmy.today 2 points 1 week ago* (last edited 1 week ago)

For old school RTS, Total Annihilation

If you don't care about the campaign, I'd probably go with the much newer games based on Total Annihilation that run on the Spring engine.

EDIT: Yeah, another user already recommended Zero-K.

[–] tal@lemmy.today 1 points 1 week ago

I mean, it's not beautiful, but for strategy games and other high-replayability games, I don't find that eye candy buys that much. Like, I feel like a good strategy game is one that you should spend a lot of time playing as you master the mechanics, and no matter how pretty the graphics, when you've seen them a ton of times...shrugs I think that eye candy works better for genres where you only see something once, like adventure games, so that the novelty is fresh. But what you like is what you like.

If it's too complicated -- and the game does have a lot of mechanics going on, even by strategy game standards -- Illwinter also has another series, Conquest of Elysium, which is considerably simpler, albeit more RNG-dependent. I personally prefer the latter, even though I know Dominions. Dominions turns into a micromanagement slogfest when you have a zillion armies moving around later in the game. Especially if you have one of the nations that can induce freespawn, like MA Ermor. Huge amounts of time handling troop movement.

It might be more tolerable if you play against other humans -- I mean, if you're playing one turn a day or something, I imagine that it's more tolerable to look at what's going on. But if you're playing against the computer, which is what I do, it has more micromanagement than I'd like.

Trying to optimize your build is neat, though. There are a lot of mutually-exclusive or semi-compatible strategies to use, lots of levers to play with, which I think is a big part of making a strategy game interesting.

I think that Dwarf Fortress has a higher learning curve, but if you're wanting a strategy game that has a gentle learning curve, I agree, Dominions probably isn't the best choice. It also doesn't have a tutorial/introduction system -- it's got an old-school, nice hefty manual.

[–] tal@lemmy.today 5 points 1 week ago

Unciv is a free, open-source reimplementation of Civilization V. It doesn't have all the eye candy and music and such that the series is famous for, but as a result it runs responsively on a phone.
