this post was submitted on 26 Feb 2024
38 points (93.2% liked)
Games
"Like" it? No, but it runs way better, and if you are using a high-resolution display, the quality upsampling methods are pretty decent on most games unless you are pixel peeping. I'd rather get 90+ fps with FSR3/DLSS3 with a 5 percent decrease in visual quality over ~45 fps at native resolution.
My qualm is all of the visual artifacting I see. Maybe it's just the games I play, but there are some pretty bad graphical glitches that bother me, and the frame timing is off or something, because it makes the game feel less smooth. Part of the smoothness issue is probably the relatively weak CPU in my laptop, but even on my desktop the frame pacing doesn't feel the same as native.
I think preferring a lower-than-native resolution over DLSS as a blanket statement is a bit of a wild take, but there can definitely be problems like artifacts, especially in certain games. As an example, I'm playing RDR2 at the moment, and the TAA (which is forced to High with DLSS) is poorly implemented and causes flickering, which is definitely annoying. I played Alan Wake 2 on an older laptop that barely ran it and I definitely noticed artifacting from DLSS there, though in fairness I was demanding a lot from that machine by forcing it to play AW2.
Frame time will of course be impacted, so if you're playing something really fast and twitchy, you should probably stay away from DLSS. It's also less of a problem if you don't enable Frame Generation. Finally, the input lag from both DLSS and Frame Generation seems to scale with your baseline FPS: using it to try to reach 60+ FPS will usually mean some input lag, while using it when you're already at ~60 FPS to get to 80-100 or whatever means less noticeable input lag.
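That scaling is really just frame-time arithmetic. A rough sketch (my own simplification, not NVIDIA's or AMD's actual pipeline: it assumes frame generation holds back roughly one real frame so it can interpolate toward the next one):

```python
# Rough sketch: why frame-generation latency is less noticeable at a
# higher baseline FPS. Assumption (simplification): interpolation must
# wait for the next real frame, adding ~one base frame time of latency.

def frame_time_ms(fps: float) -> float:
    """Time per real rendered frame, in milliseconds."""
    return 1000.0 / fps

def added_latency_ms(base_fps: float) -> float:
    # Waiting for the next real frame before displaying the generated
    # one costs roughly one base frame time.
    return frame_time_ms(base_fps)

for base in (45, 60, 90):
    print(f"{base} fps base: ~{added_latency_ms(base):.1f} ms extra latency")
# 45 fps base pays ~22 ms extra; 90 fps base only ~11 ms,
# which is why the lag is less noticeable at a high baseline.
```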
In most cases DLSS actually reduces your input lag because you're getting a higher framerate. Not sure what you're talking about.
https://youtu.be/osLDDl3HLQQ?t=219
Might be just frame generation I was thinking of.
Yeah frame generation is crap, I wouldn't ever use it.
I don't think frame generation is crap outright; it's still free frames. It's just only really useful when you're already at a solid frame rate.
It's not really free frames though. It's no different to motion interpolation on TVs that makes everything look soap opera-like. The game is still playing at the lower framerate, and the disconnect between your input running at one framerate but what you're seeing looking like it's running at another is going to feel "off".
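The disconnect can be put in rough numbers (an illustrative sketch on my part, assuming the common case where frame generation doubles the displayed rate):

```python
# Illustrative: frame generation doubles the displayed FPS, but the
# game still simulates and samples input at the real (base) framerate.
base_fps = 45
displayed_fps = base_fps * 2                  # 90 fps on screen
input_interval_ms = 1000 / base_fps           # input reacts every ~22 ms
display_interval_ms = 1000 / displayed_fps    # frames appear every ~11 ms

print(f"screen updates every {display_interval_ms:.1f} ms, "
      f"but input is only sampled every {input_interval_ms:.1f} ms")
# The screen looks like 90 fps while controls respond like 45 fps,
# which is the "off" feeling described above.
```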
I've had a horrible experience with FSR, but with DLSS I haven't noticed a single issue, so I always turn it on.
I tested it in BG3 and it didn't work very well with trees and other small objects.
DLSS has nothing to do with frame timing. DLSS also has very few, if any, visible "graphical glitches".