this post was submitted on 11 Aug 2024
37 points (100.0% liked)

Linux

SYSTEM INFO: 
- endeavourOS 6.10.3-1-cachyos
- Kernel parameters:
  - nvidia_uvm nvidia_drm.modeset=1 nvidia_drm.fbdev=1 nvidia.NVreg_PreserveVideoMemoryAllocations=1
- KDE Plasma 6.1.4 on Wayland
- AMD Ryzen 7 7800X3D with iGPU enabled on BIOS
- NVIDIA RTX 3080 with nvidia-beta-dkms 560.31.02-1
- Main display plugged on NVIDIA GPU (display port)
- Secondary display plugged on motherboard (display port)

I want to use my iGPU (AMD Raphael on a Ryzen 7 7800X3D) to render everything, and run my games on my NVIDIA RTX 3080. It's a desktop PC with two monitors: the main one is plugged into my 3080 and the secondary into the motherboard. With this arrangement, VRR/G-Sync works on my main display (the only workaround to get VRR working with an NVIDIA card and multiple monitors). The second thing I would like is for Firefox on my second screen to render/decode with my iGPU, so I can watch videos and streams there without sacrificing frames in my games (I lose about 20 frames playing Helldivers 2 if a video is playing in the background, because it uses my dGPU to decode).

I've installed nvidia-prime from the AUR and added the udev rules as written on the Arch Wiki; however, it seems my iGPU isn't being used at all:

❯ lspci | grep VGA
01:00.0 VGA compatible controller: NVIDIA Corporation GA102 [GeForce RTX 3080 Lite Hash Rate] (rev a1)
11:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Raphael (rev cb)
❯ glxinfo | grep "OpenGL renderer"
OpenGL renderer string: NVIDIA GeForce RTX 3080/PCIe/SSE2
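To check whether the offload plumbing works at all, each GPU can be queried explicitly: Mesa's `DRI_PRIME` accepts a `pci-DDDD_BB_SS_F`-style tag derived from the lspci bus address, and the NVIDIA stack has its own render-offload variables. A small sketch — the `to_dri_prime_tag` helper is hypothetical, just to show how the tag is derived:

```shell
# Hypothetical helper: turn an lspci bus address ("11:00.0") into the
# "pci-…" tag Mesa's DRI_PRIME accepts for picking a specific GPU
# (assumes the default PCI domain 0000, which lspci omits).
to_dri_prime_tag() {
    echo "pci-0000_$1" | tr ':.' '__'
}

to_dri_prime_tag "11:00.0"   # -> pci-0000_11_00_0

# With that tag, each GPU's GL renderer can be queried explicitly:
#   DRI_PRIME=pci-0000_11_00_0 glxinfo | grep "OpenGL renderer"   # AMD iGPU
#   __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
#     glxinfo | grep "OpenGL renderer"                            # NVIDIA dGPU
```

If the first glxinfo call still reports the NVIDIA renderer, the problem is below the application level.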

I've noticed that moving my cursor and windows around the desktop is now slower/laggier (see the note below), and my Turing smart screen can't detect my NVIDIA GPU, but that's the extent of the changes. When I open Firefox directly on my second screen and play a video, nvtop shows my NVIDIA GPU doing the work, with my iGPU not being utilised at all: https://i.imgur.com/CPPICUk.png (notice how there is no usage at all on my iGPU)
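One thing worth ruling out is Firefox simply defaulting to the NVIDIA VA-API stack. A sketch of forcing the AMD driver session-wide via systemd's environment.d (the file name is hypothetical, and Firefox additionally needs `media.ffmpeg.vaapi.enabled` set to true in about:config):

```ini
# ~/.config/environment.d/firefox-vaapi.conf (hypothetical file name)
# Force libva to load the AMD driver so video decode lands on the iGPU
# instead of the NVIDIA stack.
LIBVA_DRIVER_NAME=radeonsi
```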

I took a screenshot while playing Helldivers 2 with a video running at the same time; if I paused the video, I'd get my frames back. Using prime-run for my games makes no difference, since everything is handled by the dGPU anyway.

Is there anything I'm missing here? I'm happy to provide more info, but I'm a bit clueless on how to troubleshoot this further, so any help is much appreciated.

NOTE: Regarding the choppy cursor, it is not caused by GSP firmware. I've tested the proprietary drivers with and without GSP firmware, and the open drivers since 550, and I haven't noticed a single difference in my setup; they all perform exactly the same. Besides, I'm using the beta 560 drivers, which are open modules only and require GSP firmware to be enabled.

[–] Virkkunen@fedia.io 1 points 4 months ago (2 children)

I actually did follow those, and https://wiki.archlinux.org/title/NVIDIA_Optimus too. I think most of it relates to laptops where the dGPU can be turned off, but I don't think that's my case, since my main monitor is plugged into it. I guess what I need to do is find a way to set the iGPU as the default and use prime-run for whatever I need to run on my dGPU, but I'm not having much success with this.
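For making KWin treat the iGPU as the primary render device on Wayland, KWin honours the `KWIN_DRM_DEVICES` environment variable, where the first device listed becomes primary. A sketch, assuming the AMD card is card1 — the card numbering can swap between boots, so confirm which is which via `ls -l /dev/dri/by-path/` first:

```ini
# ~/.config/environment.d/kwin-gpu.conf (hypothetical file name)
# First device becomes KWin's primary: the iGPU renders the desktop,
# while the NVIDIA card is kept only for the outputs attached to it.
KWIN_DRM_DEVICES=/dev/dri/card1:/dev/dri/card0
```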

[–] breadsmasher@lemmy.world 2 points 4 months ago (1 children)

I think you would need to plug all your monitors into the iGPU to drive the output; Prime would then control which GPU a given application runs on, and return the rendering to your iGPU just for display.

Are you using X or Wayland? I think display servers only support one GPU at a time?

Possibly relevant forum post https://forum.endeavouros.com/t/dual-monitor-support-for-plasma6-wayland-and-hybrid-gpu-intel-nvidia-setup/52017/4

[–] Virkkunen@fedia.io 1 points 4 months ago

I'm running Wayland. I do feel that Plasma is using my iGPU to render the desktop, since there are noticeable stutters and lower performance compared to disabling the iGPU and having both monitors on my dGPU, but unfortunately I can't really choose what gets rendered by which GPU. On Windows this setup works fine: I can set Firefox to "power saving" or whatever and it runs on my iGPU, and videos get decoded by it.

I tried plugging both monitors into my motherboard (it has HDMI and DP outputs) and it works as expected: everything renders on the iGPU and I'd need prime-run for my games. This is far from ideal, though, since I lose VRR and HDR.
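For reference, Arch's prime-run is just a tiny wrapper script; roughly, it amounts to the following (a sketch of what the packaged script does, not a verbatim copy):

```shell
# Roughly what the nvidia-prime package's prime-run wrapper does:
# export the NVIDIA render-offload variables, then exec the program.
prime_run() {
    __NV_PRIME_RENDER_OFFLOAD=1 \
    __GLX_VENDOR_LIBRARY_NAME=nvidia \
    __VK_LAYER_NV_optimus=NVIDIA_only \
    "$@"
}

# Sanity check: the offload variable is visible to the child process.
prime_run sh -c 'echo "$__NV_PRIME_RENDER_OFFLOAD"'   # prints: 1
```

So when the iGPU is the default, any single game can be pushed to the dGPU this way (e.g. `prime-run %command%` in Steam launch options).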

[–] Owljfien@lemm.ee 1 points 4 months ago (1 children)

I've tried to tinker with this too and got caught wondering: is this what they call reverse PRIME? I wanted the dGPU as main, but to offload the browser etc. to the amdgpu, to avoid the whole NVIDIA-not-supporting-VA-API conundrum.
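Reverse PRIME proper is about driving display outputs from the other GPU, but for the decode-offload part, pointing libva at the AMD driver per application is often enough. A sketch with a hypothetical wrapper function (`radeonsi` is the AMD VA-API driver; which render node belongs to the iGPU should be checked with `ls -l /dev/dri/by-path/`):

```shell
# Hypothetical wrapper: run a program with VA-API forced onto the AMD
# driver, regardless of which GPU the session defaults to.
run_on_amd_vaapi() {
    LIBVA_DRIVER_NAME=radeonsi \
    "$@"
}

# e.g.  run_on_amd_vaapi firefox
run_on_amd_vaapi sh -c 'echo "$LIBVA_DRIVER_NAME"'   # prints: radeonsi
```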

[–] Virkkunen@fedia.io 1 points 4 months ago

I've searched a bit about reverse PRIME, and there's an entry about it on the Arch Wiki; however, it seems to cover only X11 configuration, with nothing about Wayland or anything else.

Well, at least with my current setup I can get VRR working on my main display without needing to disable my secondary one with my NVIDIA card.