this post was submitted on 10 Oct 2025
78 points (96.4% liked)

I have this question. I see people, with some frequency, sugar-coating the marriage between Nvidia GPUs and Linux. I get that if you already have an Nvidia GPU, or you need CUDA or work with AI and want to use Linux, that is possible. Nevertheless, this is still a very questionable relationship.

Shouldn't we be raising awareness for anyone planning to play titles that use DX12? I mean, a 15% to 30% performance loss using Nvidia compared to Windows, versus 5% to 15% (and sometimes equal or better performance) using AMD: isn't that something we should be alerting others to?

I know we wanna get more people on Linux, and NVIDIA’s getting better, but don’t we need some real talk about this? Or is there some secret plan to scare people away from Linux that I missed?

Am I misinformed? Is there some strong reason to buy an Nvidia GPU if your focus is gaming on Linux?

Edit: I'm adding some links about the issue in question, because I see some comments claiming that Nvidia works flawlessly:

https://forums.developer.nvidia.com/t/directx12-performance-is-terrible-on-linux/303207

https://www.reddit.com/r/linux_gaming/comments/1nr4tva/does_the_nvidia_dx12_bug_20ish_performance_loss/

Please let me know if this is already fixed on Nvidia GPUs for gaming in Linux.

top 50 comments
[–] data1701d@startrek.website 47 points 3 months ago (4 children)

I feel like most people who use Nvidia on Linux just got their machine before they were Linux users, with a small subset for ML stuff.

Honestly, I hear ROCm may finally be getting less horrible, is getting wider distro support, and supports more GPUs than it used to, so I really hope AMD will become as livable an ML dev platform as it is a desktop GPU.

[–] SkabySkalywag@lemmy.world 10 points 3 months ago (1 children)

That is correct in my case; I started with Linux earlier this year. I'll be switching to AMD for the next upgrade.

[–] warmaster@lemmy.world 5 points 3 months ago (3 children)

I did this.

From:

Intel i7 14700K + 3080 TI

To:

Ryzen 7700X + RX 7900 XTX.

The difference on Wayland is very big.

[–] utopiah@lemmy.ml 2 points 3 months ago (4 children)

Yep, that'd be me. That said, if I were to buy a new GPU today (well, tomorrow; I'm waiting on Valve's announcement of its next HMD), I might still get an NVIDIA. Even though I'm convinced 99% of LLM/GenAI is pure hype, if the 1% turns out to be useful, ethically built, and able to run on my hardware, I'd be annoyed if it couldn't because ROCm is just a tech demo that lags too far behind in performance. Then again, that percentage is so ridiculously low that I'd probably pick the card that treats the open ecosystem best.

[–] MalReynolds@piefed.social 4 points 3 months ago* (last edited 3 months ago)

ROCm works just fine on consumer cards for inferencing: it is competitive or superior in $/token/s and beats NVIDIA on power consumption. ROCm 7.0 seems to be giving a >2x uplift on consumer cards over 6.x, so that's lovely. I haven't tried 7 myself yet (waiting for the dust to settle), but I have no issues with image gen, text gen, image tagging, video scanning, etc. using containers and distroboxes on Bazzite with a 7800 XT.

Bleeding edge and research tend to be CUDA, but mainstream use cases are getting ported reasonably quickly. TL;DR: unless you're training or doing research (unlikely on consumer cards), AMD is fine and performant, plus you get stable Linux and great gaming.
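For anyone wanting to sanity-check a setup like this, here is a minimal sketch, assuming a ROCm build of PyTorch (whose wheels reuse the torch.cuda namespace for HIP devices); the tensor size is arbitrary:

```python
# Quick check that a ROCm PyTorch build actually sees the AMD GPU.
import torch

print(torch.version.hip)          # non-None on ROCm builds, None on CUDA builds
print(torch.cuda.is_available())  # True if the HIP runtime finds a supported card
if torch.cuda.is_available():
    x = torch.rand(1024, 1024, device="cuda")  # "cuda" maps to the ROCm device
    print((x @ x).sum().item())                # the matmul runs on the GPU
```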

[–] FauxLiving@lemmy.world 3 points 3 months ago (1 children)

I use local AI for speech/object recognition in my video security system and for control over my Home Assistant and media services. These services are isolated from the Internet for security reasons, which wouldn't be possible if they required OpenAI to function.

ChatGPT and Sora are just tech toys, but neural networks and machine learning are incredibly useful components. You would be well served by staying current on the technology as it develops.
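As a rough illustration of how such a pipeline stays offline, here is a hedged sketch of object detection on a single camera frame; the ultralytics package, the yolov8n.pt weights file, and the frame filename are assumptions standing in for whatever local model the system actually runs:

```python
# Offline object detection on one security-camera frame.
# The weights file is assumed to be already on disk, so nothing
# here needs an Internet connection at inference time.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")             # load local weights once
results = model("driveway_frame.jpg")  # detect objects in the frame
for box in results[0].boxes:
    label = model.names[int(box.cls)]  # e.g. "person", "car"
    print(label, float(box.conf))
```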

[–] data1701d@startrek.website 2 points 3 months ago

From what I've heard, ROCm may be finally getting out of its infancy; at the very least, I think by the time we get something useful, local, and ethical, it will be pretty well-developed.

Honestly, though, I'm in the same boat as you and actively try to avoid most AI stuff on my laptop. The only "AI" thing I use is the occasional image upscale. I find it kind of useless on photos, but it's sometimes helpful when doing vector traces of bitmap graphics with flat colors; Inkscape's results aren't always good with lower-resolution images, so running that specific kind of graphic through a "cartoon mode" upscale sometimes improves results dramatically for me.

Of course, I don't have GPU ML acceleration, so it just runs on the CPU; it's a bit slow, but still less than 10 minutes.

[–] megopie@lemmy.blahaj.zone 38 points 3 months ago (6 children)

I'd say that, in general, the advantages of Nvidia cards are fairly niche even on Windows. Multi-frame generation (fake frames) and upscaling are of questionable added value most of the time, and most people probably aren't going to be doing any ML stuff on their computer.

AMD in general offers better performance for the money, and that’s doubly so with Nvidia’s lackluster Linux support. AMD has put the work in to get their hardware running well on Linux, both in terms of work from their own team and being collaborative with the open source community.

I can see why some people would choose Nvidia cards, but I think that, even on Windows, a lot of people who buy them would have been better off with AMD. And outside of some fringe edge cases, there is no good reason to choose them when building or buying a computer you intend to mainly run Linux on.

[–] filister@lemmy.world 9 points 3 months ago (5 children)

Even though I hate Nvidia, they have a couple of advantages:

  • CUDA
  • Productivity
  • Their cards retain higher resale values

So if you need the card for productivity and not only gaming, Nvidia is probably better; if you're buying second-hand or strictly for gaming, AMD is better.

[–] CodenameDarlen@lemmy.world 34 points 3 months ago

I'd say that, given AMD's contributions to Linux, it's better to support them with your money instead of NVIDIA.

[–] mybuttnolie@sopuli.xyz 34 points 3 months ago (3 children)

Yes: HDMI 2.1. If you use a TV as a monitor, you won't get 4K120 with AMD cards on Linux, because the HDMI Forum are assholes.

[–] wonderfulvoltaire@lemmy.world 10 points 3 months ago (1 children)

I have a 6900 XT with 4K 120 output and I've never had issues with it on multiple distros. Lately Bazzite has been behaving as expected, so I don't know where this information is coming from, besides the argument that HDMI is closed source as opposed to DisplayPort.

[–] mybuttnolie@sopuli.xyz 10 points 3 months ago (1 children)

HDMI 2.0 doesn't have the bandwidth for 4K120; DisplayPort and HDMI 2.1 do. AMD's drivers don't support HDMI 2.1, because the HDMI Forum didn't allow AMD to implement it in their open source Linux driver. You still get 4K120 with DP, and even over HDMI if you use a limited colorspace.
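The arithmetic behind that, as a rough sketch (real links also spend bandwidth on blanking intervals, so actual requirements are somewhat higher):

```python
# Why 4K120 RGB doesn't fit in HDMI 2.0, but a limited colorspace does.
width, height, refresh = 3840, 2160, 120
bpp_rgb = 24   # 8-bit RGB
bpp_420 = 12   # 8-bit 4:2:0 chroma subsampling

need_rgb = width * height * refresh * bpp_rgb / 1e9  # ~23.9 Gbps
need_420 = width * height * refresh * bpp_420 / 1e9  # ~11.9 Gbps

hdmi20 = 14.4  # Gbps usable (18 Gbps raw, 8b/10b encoding)
hdmi21 = 42.7  # Gbps usable (48 Gbps raw, 16b/18b encoding)

print(f"4K120 RGB:   ~{need_rgb:.1f} Gbps (HDMI 2.0: {hdmi20}, HDMI 2.1: {hdmi21})")
print(f"4K120 4:2:0: ~{need_420:.1f} Gbps (squeezes into HDMI 2.0)")
```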

[–] wonderfulvoltaire@lemmy.world 3 points 3 months ago (1 children)

I agree; I mentioned the lack of open source support for HDMI 2.1 because that organization is exceptionally lazy.

[–] chronicledmonocle@lemmy.world 3 points 3 months ago

It's greed. Not laziness.

[–] Amaterasu@lemmy.world 6 points 3 months ago

That is a fair reason and a good reminder, actually. Thanks!

[–] Giooschi@lemmy.world 3 points 3 months ago (1 children)

You can still get it if you use DisplayPort though, no?

[–] mybuttnolie@sopuli.xyz 5 points 3 months ago

yes, but TVs don't have DP

[–] Core_of_Arden@lemmy.ml 33 points 3 months ago

I use AMD wherever possible, simply because they support Linux. There's really no other reason needed. I don't care about CUDA or anything else like that; it's just not relevant to me. I'd rather drive a mid-range car that gives me freedom than a high-end car that ties me down.

[–] just_another_person@lemmy.world 17 points 3 months ago (1 children)

AMD will have superior support and better power management out of the box hands down.

Nvidia may have a minor performance advantage in some areas depending on the card, but not in a way you'd care about unless you're obsessed with the technical specifics of graphics in AAA games.

I've been on Linux as a dev and daily driver for 20 years, and Nvidia drivers are just problematic unless you know exactly how to fix them when there are issues. That's an Nvidia problem, not a Linux problem. CUDA on AMD is also a thing if you want to go that route.

The choice is yours.

[–] vinnymac@lemmy.world 3 points 3 months ago

I'm glad you mentioned knowing how to fix them. My server has hosted Nvidia GPUs for 15-odd years now, working great, and has remained stable through updates by some miracle.

Getting it set up was a nightmare back then, though; not recommended for the faint of heart.

[–] LeFantome@programming.dev 14 points 3 months ago (1 children)

I think the answer is yes if you are shooting for the high end. AMD is better on cost/performance, but NVIDIA is still unchallenged in absolute performance if budget is not a consideration.

And if you need CUDA…

[–] Amaterasu@lemmy.world 3 points 3 months ago* (last edited 3 months ago)

I agree with that, because AMD has no offering that competes with Nvidia's high-end GPUs on absolute performance.

[–] Lukemaster69@lemmy.ca 12 points 3 months ago

It's better to go with AMD because the AMD drivers are built into the ISO, so there's less headache for gaming.

[–] raspberriesareyummy@lemmy.world 10 points 3 months ago

No, nvidia are evil unreliable pieces of shit.

[–] LeFantome@programming.dev 7 points 3 months ago (2 children)

Two pretty massive facts for anybody trying to answer this question:

  1. Since driver version 555, explicit sync has been supported. This makes a massive difference to the experience on Wayland. Most of the problems people report (e.g. black screens and flicker) are with drivers earlier than this.

  2. Since driver version 580, NVIDIA uses open source modules to interact with the kernel. These are not open source drivers; they are NVIDIA's proprietary drivers, which should now "just work" across kernel upgrades (as AMD's have forever). This solves perhaps the biggest hassle of dealing with NVIDIA on Linux.

Whether you get to enjoy these significant improvements depends on how long it takes stuff to make it to your distribution. If you are on Arch, you have this stuff today. If you are on Debian, you are still waiting (even on Debian 13).

This is not an endorsement of either distro. They are simply examples of the two extremes regarding how current the software versions are in those distros. Most other distros fall somewhere in the middle.

All this stuff will make it to all Linux users eventually. These are solved problems; they're just not solved for everyone yet.
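If you're not sure what your own machine is running, here is a minimal sketch for checking, assuming the sysfs/procfs files the NVIDIA driver normally exposes (paths can vary by driver packaging):

```python
# Report the loaded NVIDIA driver version and whether the open
# kernel modules are in use, from files the driver exposes at runtime.
from pathlib import Path

version = Path("/sys/module/nvidia/version").read_text().strip()
nvrm = Path("/proc/driver/nvidia/version").read_text()

print("driver version:", version)  # compare against the 555/580 thresholds above
print("open kernel modules:", "Open Kernel Module" in nvrm)
```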

[–] UntouchedWagons@lemmy.ca 4 points 3 months ago

Does KMS work with an Nvidia GPU now? I remember that ages ago the boot sequence would be stuck at 640x480 until X started.

[–] daggermoon@lemmy.world 6 points 3 months ago* (last edited 3 months ago) (3 children)

~~If you want to use Linux, please choose AMD. I helped install CachyOS on my sister's RTX 5080 system and it's horrible: a 40% performance loss. She's going back to Windows.~~

Edit: Not entirely accurate

[–] herseycokguzelolacak@lemmy.ml 6 points 3 months ago
[–] melfie@lemy.lol 6 points 3 months ago* (last edited 3 months ago)

NVIDIA definitely dominates for specialized workloads. Look at these Blender rendering benchmarks and notice that AMD doesn't appear until page 3. I wish there were an alternative to NVIDIA OptiX that was as fast for path tracing, but unfortunately there is not. Buy an AMD card if you're just gaming, but you're unfortunately stuck with NVIDIA if you want to do path-traced rendering cost-effectively:

https://opendata.blender.org/benchmarks/query/?compute_type=OPTIX&compute_type=CUDA&compute_type=HIP&compute_type=METAL&compute_type=ONEAPI&group_by=device_name&blender_version=4.5.0

Edit:

Here’s hoping AMD makes it to the first page with next generation hardware like Radiance Cores:

https://wccftech.com/amd-unveils-radiance-cores-neural-arrays-universal-compression-next-gen-rdna-gpu-architecture/

[–] Admetus@sopuli.xyz 4 points 3 months ago

I only play older games, open source games (like Pioneer Space Sim and Luanti), and mostly emulate PS2 (could do PS3/PS4, you bet), so AMD is fine for my use case and works out of the box. I know Nvidia's Linux support has improved, which means the latest graphics cards also pretty much work out of the box. But on principle, I support AMD for the work they have put into Linux.

[–] muusemuuse@sh.itjust.works 4 points 3 months ago (1 children)
[–] filister@lemmy.world 3 points 3 months ago

CUDA acceleration.

[–] notthebees@reddthat.com 3 points 3 months ago

Literally only CUDA. ROCm mostly works.
