Hello guys,
I have a problem on Arch Linux. My graphics card is being underutilized: it draws only around 200 W at roughly 2400 MHz, but under Windows it pulls almost 304 W (so the full potential). I am using the latest MangoHud with Proton-GE 9-27. It must be some software bug, since everything is fine on Windows.
Or might this be a bug in MangoHud?
My full specs are here:
Operating System: Arch Linux
KDE Plasma Version: 6.3.4
KDE Frameworks Version: 6.13.0
Qt Version: 6.9.0
Kernel Version: 6.14.3-arch1-1 (64-bit)
Graphics Platform: Wayland
Hardware:
Memory: 30.9 GiB of RAM
Graphics Processor: AMD Radeon Graphics (RX 9070 XT)
Manufacturer: Micro-Star International Co., Ltd.
Product Name: MS-7E26
System Version: 1.0
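In case MangoHud itself is misreporting, here is a rough way I'd cross-check power and clocks directly from the amdgpu sysfs interface while a game is running (a sketch only; the card index, hwmon number, and exact file names vary by GPU generation):

```
# Cross-check board power and shader clock straight from sysfs while the game runs.
# cardN and the hwmon index vary per machine; adjust the globs if needed.
for hw in /sys/class/drm/card*/device/hwmon/hwmon*; do
    echo "== $hw =="
    cat "$hw"/power1_average "$hw"/power1_input 2>/dev/null   # power draw in microwatts (file name varies by generation)
    cat "$hw"/power1_cap 2>/dev/null                          # current power limit in microwatts
done
cat /sys/class/drm/card*/device/pp_dpm_sclk                   # shader clock states, "*" marks the active one
```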
I regularly see my GPU (7900 XT) sit well below 80% utilization.
It might be because the game is not multi-threaded at all and a single CPU core can't keep up, or the game simply isn't demanding full performance.
Looking at the screenshots you provided, one scene has much higher CPU utilization than the other. In the second screenshot the CPU load is so high that it backed off from maximum boost.
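If you want to check for a single-core bottleneck, a minimal MangoHud config along these lines shows per-core load next to GPU power and clock (the option names are MangoHud's standard config keys; the path assumes the default ~/.config/MangoHud/MangoHud.conf):

```
# ~/.config/MangoHud/MangoHud.conf
fps
frametime
gpu_stats
gpu_power        # reported board power draw
gpu_core_clock   # current GPU core clock
cpu_stats
core_load        # per-core load bars; one pegged core with a quiet GPU points at a CPU bottleneck
```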
For Monster Hunter Wilds the performance is roughly the same, I think, though I might be wrong. For the other games (Enshrouded, ARK: SA) it's not really the same. I don't have any references, so it could also be WINE, but I'm not quite sure. Still, you can see the power draw and frequency are near 304 W, or at least close to it.
My RX 580 never hits its power cap, and the drivers certainly aren't, uh, fresh on that one.
I haven't figured it out. It strikes me as a driver thing, since changing system settings doesn't seem to change anything. I haven't tried it on Windows to see what would happen.
The RX 580 can thermally throttle without reporting a temperature issue. It doesn't expose hotspot temps, but it does read and respond to them. I'd repaste and retest; also, don't be shy about the amount of paste used. My first repaste went from ~125 W to <100 W and even lower reported temperatures.
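For reference, this is roughly how to see which temperature sensors the driver actually exposes (a sketch; on Polaris cards like the RX 580 you will usually only see an "edge" sensor, while newer cards also list "junction"/hotspot and memory):

```
# List every temperature sensor amdgpu exposes per card.
for hw in /sys/class/drm/card*/device/hwmon/hwmon*; do
    for label in "$hw"/temp*_label; do
        [ -e "$label" ] || continue
        input="${label%_label}_input"
        printf '%s: %s C\n' "$(cat "$label")" "$(( $(cat "$input") / 1000 ))"
    done
done
```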
I don’t really have an answer for OP, as I haven’t found many applications where my 9070 non-XT doesn’t pull expected power for expected performance. However, I’m also only using LACT for monitoring.
Which 'drivers' are you using? Mesa would be the most probable, then there is AMDVLK, or you could try the proprietary AMD drivers. You could also try setting amdgpu.ppfeaturemask=0xffffffff and/or amdgpu.runpm=0 in your boot parameters. Also check that you are using the latest linux-firmware version.
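For anyone following along, a rough sketch of adding those parameters with GRUB (stock Arch paths assumed; systemd-boot users would append them to the options line of their loader entry instead):

```
# /etc/default/grub -- append the amdgpu options to the existing kernel command line
GRUB_CMDLINE_LINUX_DEFAULT="loglevel=3 quiet amdgpu.ppfeaturemask=0xffffffff amdgpu.runpm=0"

# regenerate the GRUB config and reboot
sudo grub-mkconfig -o /boot/grub/grub.cfg

# after the reboot, confirm the parameters actually took effect
cat /proc/cmdline
```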
Thanks, yeah, I am using Mesa and the latest firmware on Arch Linux. I will try those kernel parameters to see if they make a difference. I will also try AMDVLK again, let's see.
So I tried the kernel parameters as well as AMDVLK; no difference there. What I wonder is why my BIOS version is slightly different from the one in TechPowerUp's BIOS collection.
The version currently certified on TechPowerUp ends in US3, but I don't really know what could be so different about that version (mine ends in US4).
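In case it helps anyone comparing the two Vulkan drivers, here is a sketch of forcing a specific ICD per run via the Vulkan loader's environment variable (the JSON paths are the usual Arch install locations from vulkan-radeon and amdvlk; check what actually sits in /usr/share/vulkan/icd.d/ on your system):

```
# Show which Vulkan driver is picked up by default (needs vulkan-tools)
vulkaninfo --summary | grep -i driver

# Force RADV (Mesa) for a single run, e.g. in Steam launch options
VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/radeon_icd.x86_64.json %command%

# Force AMDVLK instead
VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/amd_icd64.json %command%
```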
Mesa 25.0.5 is the latest. Does it help? Mesa 25.1.0 is also in the testing repos, but unless you are prepared to debug weird problems I'd wait for it to hit the stable repos…
OK, I need to clear this up. It was TLP, which I had configured via TLP-UI, enabling some GPU performance settings a while back. I disabled all of them and the problem is gone now. I could have thought of that before all of this, but it's solved anyway. This time for sure lol.
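For anyone who hits the same thing: these are the kind of TLP options to look at (names from TLP's Radeon/amdgpu section; the exact set depends on your TLP version, so check the comments in /etc/tlp.conf):

```
# /etc/tlp.conf -- amdgpu performance levels TLP forces on AC/battery
# Setting them to "auto" (or commenting them out) hands control back to the driver.
RADEON_DPM_PERF_LEVEL_ON_AC=auto
RADEON_DPM_PERF_LEVEL_ON_BAT=auto

# apply without rebooting
sudo tlp start
```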
I had this problem with my 7900 XT. Driver updates in the months following release improved things quite a bit. Now, a year-plus in, all games run at the full 265 W if I let them. I manually set the power limit to a lower value because of fan noise and the power bill.
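For what it's worth, this is roughly how the power limit can be set through sysfs; LACT does the same thing from a GUI (a sketch only: the card index and hwmon number vary, the value is in microwatts, and 230 W is just an example):

```
# Check the allowed range first, then lower the board power limit to ~230 W.
HWMON=/sys/class/drm/card1/device/hwmon/hwmon*
cat $HWMON/power1_cap_min $HWMON/power1_cap_max
echo 230000000 | sudo tee $HWMON/power1_cap
```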
So I suspect an early-adopter tax on the driver side. I went through the same at the 7900 XT launch.
Patience.
Yeah, settings can limit that too. I run at 1080p windowed (49" widescreen) with 120 fps VSync, so the card just has plenty of performance headroom for a lot of stuff. But 5120x1440 fullscreen hits 265 W basically all the time in AAA games.
Interestingly, even with the subpar drivers at launch, I managed to get 265 W and 3 GHz+ on compute jobs, so I at least knew what the chip was capable of.