NVIDIA: Copyrighted Books Are Just Statistical Correlations to Our AI Models (torrentfreak.com)
"But I NEEEEEEEEED to play my games on 164x AA, ultra textures, unlimited render distance, no optimization, 900 fps (on a 60 fps 720p monitor) MOOOOOMMMM!!!"
That's how I hear every excuse for a recent Nvidia purchase.
I have no doubt that the 4090 is a fantastic piece of hardware, but I just don't see a justification for upgrading.
I play games on a 4K/60 monitor, generally with close to max graphics settings (obviously within reason). My 2080 Ti handles that just fine. I also couldn't care less about framerate unless the game is noticeably stuttering, so that might help.
I'm still using my same 7700k and 1060 and for 1080p stuff it's just fine, dagnabbit.
I mean, I know for a fact that Cyberpunk 2077 barely runs on low with that setup.
It's totally fine if you just play games like Slay the Spire or Enter the Gungeon. But that rig, from personal experience, runs high-end graphics like shit. Hell, even Elden Ring runs like shit on that setup.
....runs great for me. Not sure why it doesn't for you.