this post was submitted on 11 Aug 2024
Linux
SYSTEM INFO: 
- EndeavourOS (kernel 6.10.3-1-cachyos)
- Kernel parameters:
  - nvidia_uvm nvidia_drm.modeset=1 nvidia_drm.fbdev=1 nvidia.NVreg_PreserveVideoMemoryAllocations=1
- KDE Plasma 6.1.4 on Wayland
- AMD Ryzen 7 7800X3D with iGPU enabled on BIOS
- NVIDIA RTX 3080 with nvidia-beta-dkms 560.31.02-1
- Main display plugged into the NVIDIA GPU (DisplayPort)
- Secondary display plugged into the motherboard (DisplayPort)

I want my iGPU (AMD Raphael on the Ryzen 7 7800X3D) to render everything, and to run my games on my NVIDIA RTX 3080. It's a desktop PC with two monitors: the main one is plugged into the 3080 and the secondary into the motherboard. With that arrangement, VRR/G-Sync works on my main display (the only workaround I've found to get VRR working with an NVIDIA card and multiple monitors). Second, I'd like Firefox on my second screen to render and decode video with the iGPU, so I can watch videos and streams there without sacrificing frames in my games (I lose about 20 FPS in Helldivers 2 if a video plays in the background, because it uses the dGPU to decode).

I've installed nvidia-prime from the AUR and added the udev rules as described on the Arch Wiki; however, it seems my iGPU isn't being used at all:

❯ lspci | grep VGA
01:00.0 VGA compatible controller: NVIDIA Corporation GA102 [GeForce RTX 3080 Lite Hash Rate] (rev a1)
11:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Raphael (rev cb)
❯ glxinfo | grep "OpenGL renderer"
OpenGL renderer string: NVIDIA GeForce RTX 3080/PCIe/SSE2
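For anyone trying to reproduce this, the renderer each stack picks can be queried per GPU. This is a diagnostic sketch; the device numbering is an assumption and may differ between systems, so check `/dev/dri/by-path/` first.

```shell
# Map DRM nodes to PCI devices so you know which card is which
# (cardN numbering is not stable across boots):
ls -l /dev/dri/by-path/

# Ask Mesa to offload to the second GPU (the AMD iGPU here);
# DRI_PRIME is a Mesa variable, so it doesn't affect the NVIDIA
# proprietary stack:
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# NVIDIA's render-offload variables (what prime-run sets), for comparison:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"
```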

I've noticed that moving my cursor and windows around the desktop is now slower/laggier (see the note below), and my Turing smart monitor can no longer detect my NVIDIA GPU, but that's the extent of the changes. When I open Firefox directly on my second screen and play a video, nvtop shows that the NVIDIA GPU is the one doing the work, with the iGPU not being utilised at all: https://i.imgur.com/CPPICUk.png (notice there is no usage at all on the iGPU)
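For the Firefox half of the problem, one thing worth trying (a sketch, not something I've confirmed on this exact setup) is launching it with Mesa's PRIME variable pointing at the iGPU, so its GL and video decoding go through the AMD device. The PCI tag below is taken from the lspci output above, but double-check it on your system.

```shell
# Force Mesa's device selection onto the AMD iGPU for this process only.
# Newer Mesa accepts a pci-* tag for DRI_PRIME, which is more robust than
# an index; 11:00.0 is the Raphael iGPU from lspci above.
DRI_PRIME=pci-0000_11_00_0 MOZ_ENABLE_WAYLAND=1 firefox
```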

I took a screenshot while playing Helldivers 2 with a video running at the same time, and if I paused the video I'd get my frames back. Using prime-run to launch my games makes no difference, since everything is handled by the dGPU anyway.

Is there anything I'm missing here? I'm happy to provide more info but I'm a bit clueless on how to troubleshoot this more so any help is much appreciated.
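One more knob that might be the missing piece (an assumption on my part, not tested here): on Plasma/Wayland, KWin picks its render devices from the KWIN_DRM_DEVICES environment variable, so listing the iGPU's node first should make the compositor render on the AMD card, with the NVIDIA card as secondary.

```shell
# Sketch: tell KWin (Wayland) to treat the iGPU as the primary device.
# Verify which /dev/dri/cardN is the AMD one first (e.g. via
# /dev/dri/by-path/) -- the card1/card0 order here is an assumption.
# Put this line in /etc/environment (or a systemd user environment file):
KWIN_DRM_DEVICES=/dev/dri/card1:/dev/dri/card0
```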

NOTE: The choppy cursor is not caused by GSP firmware. I've tested the proprietary drivers with and without GSP firmware, and the open drivers since 550, and I haven't noticed a single difference in my setup; they all perform exactly the same. Besides, I'm on the beta 560 drivers, which ship the open kernel modules only and require GSP firmware to be enabled.

[–] Virkkunen@fedia.io 1 points 3 months ago

I've searched a bit about reverse PRIME, and there's an entry about it on the Arch Wiki, but it only covers X11 configuration and says nothing about Wayland.

Well, at least with my current setup I can get VRR working on my main display without having to disable the secondary one, despite the NVIDIA card.