Is selling my 4090 FE and switching to Radeon strictly for Linux support a bad idea?

TL;DR:
If I go and pick up a 7900 XTX, will my Linux desktop experience be sunshine and rainbows with triple 4K monitors running different refresh rates (Wayland)? And is idle power consumption with multi-monitor setups still horrible?

Longer Backstory:

I’m not a big-time gamer anymore, but when I do game I crank every setting to max. I’m old now and can afford nice things, but I don’t like to be wasteful. I’ve administered Linux servers for decades and am very comfortable on the CLI. I’ve also tried switching to Linux as my primary desktop every few years since around 2000 and have never succeeded for long.

I run 3x 32” 4K monitors, with the center being a 32” 240 Hz OLED with VRR support. Multiple monitors have been the bane of my Linux existence; they’re always the thing that breaks the experience entirely. I have waited literally decades for something like Wayland to come along and make my Linux desktop dream come true.

Today I installed Nobara Linux, followed by openSUSE Tumbleweed (with the Nvidia 550/555 proprietary drivers), to take my annual Linux desktop spin. It didn’t go well. At all. Basic desktop functionality was still crippled and extremely buggy on both Xorg and Wayland with KDE Plasma 6.1 – I’m sure this isn’t news to anyone. I can go into details if anyone cares, but I’ll spare everyone for now since they’re likely well known.

I’m looking for brutally honest feedback from others who have direct experience with multi-monitor setups on AMD GPUs. What works, what doesn’t? I need VRR support, fractional scaling, and the ability to run different refresh rates across multiple monitors, so Wayland is it.

I’ve seen conflicting reports about idle power usage for multi-monitor, high-refresh-rate setups with the 7900 XTX – idle power usage of 80-100 watts isn’t going to fly at 54¢/kWh here.

I considered picking up a cheaper AMD GPU to test with, but spending $400 on a “cheap” GPU that can’t game at 4K at all seems like poor value compared to just going all in on a 7900 XTX.

Thanks!

2 Likes

Works out of the box on Debian with LXDE… not pretty, but my God does it run.

I am not surprised you had issues with Nobara and openSUSE Tumbleweed. The reason you had problems with Nobara Linux (I had never heard of it; I assumed it was based on openSUSE) is that Nvidia graphics cards are going to have lots of issues with Wayland for the next ten years or more. The problem with KDE Plasma 6.1 is that it uses Wayland by default, not Xorg. I had the same issues when I tried Ubuntu 24.04, so I switched to Kubuntu. The developers of Kubuntu decided to stay with Plasma 5.27 until the mess caused by every Linux developer switching to Wayland instead of Xorg is mostly fixed.

My only advice is to try Arch or Kubuntu and see if they work for you. I have never owned an AMD graphics card, so I can’t give any advice about them; the most advanced graphics card features and behaviors are going to be available on Arch first (not on anything based on Arch).

6900 XT here: 3440x1440@120 and 2560x1440@165 rotated 90°.

For the things the oh-so-mighty, benevolent Wayland gods have seen fit for us mere plebs to do on our own computers, this setup works perfectly, and you don’t get things like tearing on rotated screens.

Animations on KWin are, for the most part, smoother and better presented. The last time I used GNOME on Wayland the cursor would lag, though I hear that has been fixed.

So long as you’re a good crab and play in your little bucket, and don’t try to do evil and disgusting things like share your own screen or expect windows to remember where they were last placed, Wayland is really good at esoteric display setups.

2 Likes

Ahhh, fellow Californian?

So Xorg should not be buggy on Nvidia, but Wayland is very much expected to be buggy on Nvidia. They’ve made some attempts to improve this, but they’re still… not there yet.

I have a 6900 XT, which is probably a fairly decent analogue to your intended configuration. I’m pushing 3840x1600 and 1080p on my desktop right now: different DPI, different refresh rates. I’m using sway (a Wayland-based tiling window manager), and as far as display management goes, it’s great. But not everything is perfect.

A few things to know:

If you’re using varying resolutions, scaling factors, and refresh rates per monitor, you need to set your primary monitor to the one you intend to game on. For example, say you have a 1440p 144 Hz monitor, a 4K 90 Hz monitor, and a 1080p 240 Hz monitor. When you launch a game on Linux under Wayland, it will almost always use xwayland to push pixels, and this has its limitations. xwayland is good, don’t get me wrong, but you have to be aware of what you’re doing: xwayland reads your primary display to set the randr characteristics of the virtual xwayland display. This basically means that if you want the scaling and max refresh rate to be correct, you need to set your primary display to be the one you’re planning to game on.

I can check my idle power consumption when I get home. I’m assuming you’re reading that from nvidia-smi and not at the wall? IIRC, my idle consumption is somewhere in the range of 5-10 W. I don’t think you’ll be able to do much better than that.

As for GPU drivers, AMD is pretty much “what you get is what you get”. It’s all built into the kernel, and the userland packages for this are usually installed by default. That means, though, that if you’re in a situation where you have problems, you are more than likely SOL.

AMD on wayland is ahem the way it’s meant to be played. :troll:

If you’re on Linux, AMD + Wayland is going to be a FAR better experience than Nvidia and anything else. That’s just Nvidia. They’re… not friendly to Linux. Not friendly to Windows either, for that matter.

AMD puts a lot of effort into having working Linux drivers at launch, or shortly after. It can sometimes take time for those to filter down and actually land in the package manager for your distro of choice, but the 7900 XTX has been out long enough that you’re in good shape if you decide to install one today, regardless of the distro (as long as it’s the latest version of said distro).

Ohhh! I almost forgot VRR. AMD only supports FreeSync. This is one of those things I’m not quite as up to speed on. My larger display supports FreeSync and seems to be working properly. AMD has support for VRR in the drivers, and my window manager has an adaptive_sync on directive I can issue to a specific display, which I’ve enabled. So, all in all, I believe it’s supported and works. I can run a specific FreeSync test tonight and let you know if you’re interested.
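In sway that’s all a few lines in the config; a sketch with made-up output names and modes (`swaymsg -t get_outputs` lists your actual ones):

```
# ~/.config/sway/config -- per-output settings (example names/modes)
output DP-1     mode 3840x1600@75Hz  scale 1
output HDMI-A-1 mode 1920x1080@60Hz  transform 90
# enable VRR/FreeSync on the main display
output DP-1     adaptive_sync on
```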


Welp, that turned out to be a lot longer than I thought it’d be. Hope it helps!

3 Likes

I would stick with Nvidia; if you were going to swap, it made more sense in the past. Now we have decent Wayland support and explicit sync. The only thing AMD has on Nvidia currently is multi-monitor VRR. I’m getting around that with an alias that disables my other monitors when I game.

You should also be using the 555 driver. I’m having a really good experience with Arch-based distros on the latest Nvidia driver with Plasma.

1 Like

It’s also worth mentioning that it seems like Arch tends to be the only distro that actually provides a (lasting) stable experience for relatively new features, so distro choice may be a factor here.

3 Likes

I switched directly from Windows to Arch/Plasma/Wayland, and it’s been super stable. I am very happy with the bleeding-edge features and the AUR.

I can’t edit my last reply, so I will just say here that I think multi-monitor VRR will come soon for Nvidia as well.

Yes, my 3080 draws 100 W idling with 2x 1440p 144 Hz and 1x 1440p 240 Hz. I’m not sure if AMD is much better, though.

1 Like

Wait wait wait, is that measured via nvidia-smi or whole-pc draw?

Because my 6900 XT pulls ~10 W at idle with 1x 3840x1600@75 and 1x 1920x1080@60. Granted, I’m pushing much lower bandwidth, but not just 10% of the bandwidth you are.

1 Like

Stick with Nvidia IMO, but try Fedora. I don’t know why anyone recommends anything else for gaming. You have to enable RPM Fusion for the Nvidia drivers, but other than that it is straightforward. I daily drive all AMD, but I’ll be honest: the experience with my 2070 was no worse, probably better.

I’m not sure what issues you had on Xorg; that seems odd, it should be fine. I’m not sure if Nvidia works on Wayland yet – I think you need the 555 beta drivers, but I don’t really pay attention. IIRC there were a lot of issues with the post-535 drivers, so maybe that’s what you were seeing; try either the beta or the older drivers. Also, anyone actually still using Nvidia, correct me if I’m wrong.

You answered your own question.

Fedora’s package availability leaves a lot to be desired, and continued use of it is feeding the eldritch horror that is Red Hat.

4 Likes

nvidia-smi :sob:

Yeah, from my experience I highly recommend CachyOS or EndeavourOS for gaming. The AUR is super nice for someone coming from Windows, and there are tons of resources out there for anything Arch.

1 Like

Damn, that’s wild.

Well, I’m glad I didn’t go with a 3080 then; it was what I originally wanted for my build, but stock…

So it looks like the annual cost of letting that GPU idle at these criminal electric rates is ~$160, assuming 8 hours a day of power-on time. Not a huge cost, but it definitely adds up, and I’m personally averse to paying the utility that much to run a single GPU.
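A quick sketch of that estimate (100 W idle, 8 h/day, the 54¢/kWh rate mentioned above):

```python
def annual_idle_cost(watts: float, hours_per_day: float, rate_per_kwh: float) -> float:
    """Yearly cost of a constant load: kWh consumed per year times the utility rate."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# 100 W idle, 8 h/day, at $0.54/kWh
print(round(annual_idle_cost(100, 8, 0.54), 2))  # 157.68
```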

2 Likes

For my 3080, running 8 hours per day at 100 W costs me $40 per year.

That seems a lot better :rofl:

So many great replies, thanks everyone!

@Shadowbane - Nobara is based on Fedora 40. It’s a niche distro developed by a Red Hat engineer, GloriousEggroll, who also develops the Proton fork Proton-GE. It’s a very fast way to spin up a current Fedora install that includes out-of-the-box Nvidia drivers and the latest kernel, built by someone who knows what they’re doing on the Linux desktop when gaming is a large priority. It’s not a distro I would personally run long term, since I don’t need all of the tweaks, but it’s a very fast way to get an extremely current Linux environment for testing.

@SgtAwesomesauce - Thanks for the detailed info, extremely helpful, exactly what I was hoping for! I just went to Micro Center and grabbed a 7900 XTX based on your reply. I figure at this point I just have to see for myself, since there’s still plenty of division on this topic. I suspect idle power usage will be atrocious, so it should be interesting… hopefully I’m pleasantly surprised! :rofl:

@packetauditor - Unfortunately my experience has not mirrored yours. Are you running multiple monitors at different refresh rates, as well as VRR? Which DE? I also tried the Nvidia 555 driver and experienced flickering and overall buggy behavior on Wayland. Xorg – I suspect due to the refresh rate variance between monitors – instantly became very laggy and strange on 555 with KDE 6.1.3.

@trezamere - Nobara is Fedora 40, so I’ve already tried Fedora (with Nvidia 555 and kernel 6.10.2). Are you running multiple monitors, VRR, and different refresh rates among the monitors? (I only have VRR support on my primary monitor; the side monitors are 60 Hz.) On Nvidia 555 with KDE 6.1.3 in an Xorg session, the entire desktop becomes noticeably laggy with my setup – resizing windows is slow and strange, etc. On Wayland that all disappears, but flickering and other unstable behavior starts in its place.

I have played this game plenty of times in recent memory with Arch as well – I am distro agnostic, but in general, with Nvidia I understand you want to be close to the bleeding edge. I do not believe Arch will make a material difference in performance or reliability compared to Nobara (Fedora 40), which is on Nvidia 555 and kernel 6.10.2, or openSUSE Tumbleweed, which is also on Nvidia 555 and an extremely recent kernel. The internet is littered with complaints about the 560 Nvidia beta.

I just pulled the trigger on the 7900 XTX and am installing it right now – I’ll report back on how it goes as another point of reference. It’s clear everyone’s experience will differ based on unique hardware configs and personal expectations.

3 Likes

Just as another point of reference: according to HWiNFO64 on Windows, my 4090 is pulling 22 W on the desktop with a bunch of standard desktop applications running in the background. All monitors are 4K; the main monitor is set to 240 Hz with VRR enabled, and the side monitors are only 60 Hz, so that is likely helping.

1 Like

Multiple monitors at different refresh rates and VRR, yes. Plasma 6.1.3 Wayland.

1 Like