GPU High Idle Power Usage

I have an RTX 3080 Founders card, and I’ve noticed that it is reporting 75-80W usage while idle - at least, while just using a browser and nothing else going on. I do have 3 displays connected, but this still seems rather high for not doing anything. Running Windows 11.

Searching this issue brings up tons of posts pointing to the power management setting in the Nvidia Control Panel. For me, this is already set to “Normal” rather than the maximum-performance setting, so no luck there.

I can see that when the displays turn off after the inactivity timeout, the GPU power drops to 5W or so, so that’s something. While sitting on the desktop with the displays running, the GPU clock is 210 MHz, but the memory clock is 2375 MHz and the video clock is 1215 MHz. Those clocks drop when the displays are off as well, which I’m guessing is what contributes to the power savings.

Any thoughts on what I could check here to reduce idle power usage further? Is this just expected usage in this configuration?

I don’t have any suggestions for troubleshooting as I’m on Linux, but if that 75-80W is just for the GPU, that seems extremely high. My entire computer idles at less than that with a 3090 and 3 displays connected (1 HDMI 2.1, 2 DisplayPort). The GPU clock is 210 MHz and the memory clock is 810 MHz.

One thing that might be worth checking is whether PCIe ASPM is enabled for the card. If I remember correctly, Windows calls it “Link State Power Management.” I read recently that this was the cause of high idle power usage for Intel Arc GPUs; not sure if it affects Nvidia as well. (On my system, ASPM is enabled by default for the 3090, though that’s on Linux.)
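On Linux, at least, you can check the current ASPM state for a device by reading it out of `lspci -vv`. Here’s a rough Python sketch of pulling it from the `LnkCtl:` line; `aspm_state` is just my own helper name, and the sample text is an illustration of the output format, assuming your `lspci` prints it this way:

```python
import re

def aspm_state(lspci_vv):
    # Scan for the LnkCtl line, which reports the currently enabled ASPM state.
    for line in lspci_vv.splitlines():
        m = re.search(r"LnkCtl:\s*ASPM ([^;]+);", line)
        if m:
            return m.group(1).strip()
    return None

# Illustrative snippet of `lspci -vv` output for a GPU (not a real capture)
sample = (
    "\tLnkCap:\tPort #0, Speed 16GT/s, Width x16, ASPM L0s L1\n"
    "\tLnkCtl:\tASPM L1 Enabled; RCB 64 bytes, Disabled- CommClk+\n"
)
print(aspm_state(sample))  # -> L1 Enabled
```

If it prints `Disabled`, the card isn’t using ASPM at all, which would be worth fixing first.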


It is, yeah, and that’s what has me a bit frustrated. I’m using hwinfo64 to monitor it.

I’ve seen this setting before, so I checked it. Right now it’s set to “Moderate Power Savings,” with the other two options being “Off” and “Maximum Power Savings.” I tried both but didn’t see any difference in GPU power usage. There might be a BIOS setting for this as well, so I may double-check there, but I think it’s likely on “Auto” or similar.

Might it have something to do with your PC power management settings? E.g., High Performance enabled vs. Balanced or Power Saver.

My Windows power setting is on Balanced. Changing it to “Best power efficiency” doesn’t change the behavior of the GPU.

Are you sure that the wattage reading is accurate? Have you tried to measure the system power consumption at the wall outlet?

You’ve got a multi-monitor setup, so welcome to noVideo bugs from the Kepler era. Nvidia will tell you it’s working as intended even though they know it’s a long-running issue. From what I could figure out, the devs just couldn’t find a fix, so they filed it under “expected behavior.”


Fairly certain. My UPS reports the usage with the displays off at about 150W, and then it jumps to 350W just by waking them up. I don’t suspect that hwinfo64 is 100% accurate, but the other thing I notice is that when I come back to the system after the displays are turned off, the GPU fans are not spinning. Shortly after I wake up the displays, the fans start spinning to keep the GPU cool. So that leads me to believe it is fairly accurate.

Well, I’ve found many posts out there with claims about getting Nvidia power usage down to under 20W even in multi-display configurations. Do AMD GPUs not exhibit this behavior?

It’s a driver bug for Nvidia GPUs. You can try DDU and use just one monitor to check whether that’s the issue you’re having, but generally the problem is that the GPU sticks at 3D clocks when you have multiple monitors.

My 3080 draws 32W at idle with dual 4K monitors on in Linux, as measured by nvidia-smi and GUI tools. It “spikes” to 38W when scrolling this page in a browser :slight_smile:

I know this is not really an apples to apples comparison, but a data point nonetheless.

My hardware includes an X570 mobo and a 5900X.
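For reference, the nvidia-smi numbers come from a query like the one in the comment below, which emits CSV. Here’s a quick Python sketch of parsing that form; the sample line is made up to show the format, and `parse_gpu_row` is my own helper name:

```python
import csv, io

# Assumed query (run on a box with an Nvidia GPU and driver installed):
#   nvidia-smi --query-gpu=power.draw,clocks.sm,clocks.mem --format=csv,noheader
# Illustrative output line, not a real capture:
sample = "32.10 W, 210 MHz, 405 MHz\n"

def parse_gpu_row(text):
    row = next(csv.reader(io.StringIO(text)))
    # Each field looks like "32.10 W"; split the value off from the unit.
    power_w = float(row[0].split()[0])
    sm_mhz = int(row[1].split()[0])
    mem_mhz = int(row[2].split()[0])
    return power_w, sm_mhz, mem_mhz

print(parse_gpu_row(sample))  # -> (32.1, 210, 405)
```

Logging that every few seconds while toggling display settings makes it easy to see exactly which change kicks the memory clock up.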

Can you please elaborate on the remaining hw in your setup?


I’m running a 7900X on X670. Fresh install as of this month. Nothing really out of the ordinary there, 2 sticks of RAM, an M.2 SSD.

I’ve heard that you can save power by running your additional monitors off onboard graphics instead of the graphics card. This was in the context of an Intel processor, but it might be worth a shot.

Running a plug power meter on my system (i5 10500, RTX 2070 Super), it’s around 20W of extra power consumption for each monitor that is on: 54W idle with screens in power save, 75W with one screen on, 90W with both screens on. So your numbers for three displays aren’t completely out there.
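As a back-of-the-envelope model of those readings (the ~20W per monitor is an eyeballed average, not a measured constant, and `est_idle_watts` is just my own helper):

```python
# Rough linear model of the wall readings above (i5 10500 / RTX 2070 Super box):
# ~54W base with all screens in power save, plus roughly 20W per awake monitor.
def est_idle_watts(base_w, per_monitor_w, monitors_on):
    return base_w + per_monitor_w * monitors_on

print(est_idle_watts(54, 20, 0))  # 54.0  (measured 54W)
print(est_idle_watts(54, 20, 1))  # 74.0  (measured 75W)
print(est_idle_watts(54, 20, 2))  # 94.0  (measured 90W)
```

By that rough model, three awake displays would land around 114W at the wall, so 75-80W at the GPU alone still reads high but not absurd.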

Just to follow up on my own post: it certainly does save a bit of power plugging my secondary monitor (4K 28") into the motherboard graphics; power consumption barely increases. Unfortunately, it is noticeably laggier scrolling around webpages and moving windows than when plugged into the Nvidia card.

I’ve now noticed you have a 5900X, so unfortunately that’s not an option in any case. It would be interesting to see how it works on the new 7xxx series with onboard graphics.


You could also try to disable hardware accelerated GPU scheduling and see whether that lowers the power consumption.

You mentioned the number of displays but not their resolutions, which matters greatly (1080p vs. 4K). GDDR6X is known to be a power hog even at idle, but only to the tune of maybe an extra 20W, so I would say you definitely have some other kind of issue going on.

You’re also on a consumer platform that has been on the market for less than a month, so it is entirely possible that your problem has nothing to do with the GPU itself. Is your BIOS current? Have you tried putting some of the displays on the iGPU?


Check out this thread about high GPU power consumption on AM5 system: Ryzen 7000 linux idle power, what's normal? High CPU PPT

There is a very subtle difference between CPU and GPU… :wink:

“C”, “G” - almost the same :slight_smile:

No, seriously. The thread is about high power consumption observed by the AMDGPU driver. I took that to mean the integrated GPU unit, but I think the same driver covers the AMD graphics cards.

I see a potential connection based on the same CPU platform used (AM5). Maybe a UEFI/BIOS issue that causes high GPU power consumption when idle?

This did occur to me, but I’m afraid of it causing other, unrelated issues that I’d rather not deal with. I’ll keep it in mind as something to try.

True enough. My primary display is a 4K/120Hz panel, the second is a 1440p/120Hz panel, and the third is a 1080p/144Hz panel. So all different.

I experimented with the different panels and discovered that some combinations of them allow the GPU to drop its power further, down to where it should be. With all 3 connected, it jumps back to 75-80W.

But wait! If I switch the 1080p display to HDMI instead of DisplayPort, things get better! I’m seeing 30W GPU usage at idle just by swapping the connector type. This display now has to run at 60Hz instead of 120 or 144 (lowering the refresh rate on DP did not change the power consumption), but since it’s a side display that shouldn’t be too much of an issue.

I guess the lesson here is that I need newer displays rather than continuing to run old ones? lol

So some further experiments here. I did not notice at the time, but my 1440p display dropped itself back to 60Hz while changing things around earlier. As soon as I set it back to 120Hz, the GPU power usage went back to being locked at 80W.

Some further tinkering later (including putting the 1080p panel back on DP), and it appears that the actual trigger here is whenever any non-primary display is set to a refresh rate that matches or exceeds the primary display’s refresh rate. If my 4K panel is 120Hz, then the two other monitors can be 100Hz and the GPU will sit at 30W idle, but as soon as either one goes to 120 or 144, then the power usage is cranked. If I set the 4K panel to 144Hz, then I can set the secondary panels to 120Hz each and the GPU power usage will drop to 30W, but setting either secondary panel to 144 will cause the power usage to jump.

Really wild behavior.
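If I had to write the rule down, it would look something like this. This is purely a codification of my own observations, not anything Nvidia documents, and `stays_low_power` is my own naming:

```python
# Observed rule: idle power stays low only while every secondary display
# refreshes strictly slower than the primary display.
def stays_low_power(primary_hz, secondary_hz):
    return all(hz < primary_hz for hz in secondary_hz)

# The combinations I tested (4K primary, 1440p and 1080p secondaries):
print(stays_low_power(120, [100, 100]))  # True  -> ~30W idle
print(stays_low_power(120, [120, 100]))  # False -> ~80W idle
print(stays_low_power(144, [120, 120]))  # True  -> ~30W idle
print(stays_low_power(144, [144, 120]))  # False -> ~80W idle
```

No idea whether this holds for other card generations or display counts, but it fits every combination I tried.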