KDE on a 144 Hz monitor with an Nvidia GPU

I read that KDE does not do well with Nvidia GPUs and that Linux has issues with 144 Hz monitors. I am setting up an Arch machine on my gaming PC and was curious if there is anything I need to do to get both working well (Nvidia with KDE, and Arch with 144 Hz).

I use Kubuntu with a 20-series Nvidia card. It’s fine.

You definitely want to install the proprietary Nvidia drivers. The open-source one is called nouveau, and it’s bad.
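
On Arch that's roughly the following (a sketch, assuming the standard repo packages and the stock kernel; use nvidia-lts instead if you run the LTS kernel, and check the Arch wiki for your card's generation):

```
# Proprietary driver, userspace libraries, and the settings GUI
sudo pacman -S nvidia nvidia-utils nvidia-settings
```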

If memory serves, in order to have the compositor run at 144 FPS, you have to edit ~/.config/kwinrc. Set MaxFPS=144, instead of 60 or whatever the default is. Change it back if that causes any weird issues, but it was fine on my system.
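
If you'd rather do it from a terminal, something like this should work. I'm assuming MaxFPS still lives in the [Compositing] group of kwinrc, and the RefreshRate key is an extra guess on my part; double-check both on your Plasma version:

```
# Cap the X11 compositor at 144 FPS (MaxFPS), and hint the refresh rate (RefreshRate, optional)
kwriteconfig5 --file kwinrc --group Compositing --key MaxFPS 144
kwriteconfig5 --file kwinrc --group Compositing --key RefreshRate 144

# Restart KWin so it picks up the change
kwin_x11 --replace &
```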

KDE’s Vsync implementation is not as good as Windows’. With Vsync on, I find that dragging windows around on the desktop feels noticeably worse. Using middle-mouse to autoscroll down a website slowly reveals a ton of microstutter, and who knows how many lag frames. However, in compositor options, you can just turn it off; there’s tearing, of course, but otherwise it’s silky smooth. There’s an option in the Nvidia control panel for G-sync/compatible, but I don’t think it does anything outside of 3D-accelerated fullscreen applications.
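
For what it's worth, you don't even need the settings page to test this: on Plasma 5 the compositor can be suspended and resumed on the fly (Alt+Shift+F12 is the default shortcut, if I remember right), or over D-Bus:

```
# Suspend the KWin compositor (tearing comes back, but no compositor latency)
qdbus org.kde.KWin /Compositor suspend

# Resume it again
qdbus org.kde.KWin /Compositor resume
```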

Are there any specific use-cases you’re wondering about?


For anything after the 700 series, anyway.

No, there aren't any specific use cases. I'm definitely going for the proprietary drivers since I want the best performance. Thanks for telling me about the Vsync; I usually have it on even if it has a performance impact. I'm pretty much just asking any question that comes to mind that I can't find an answer to by looking it up, since this is my first time installing Linux on a "gaming PC" and I want everything to work.

No problem. I should mention that I don't really play games in Linux (I dual-boot with Win10), so I'm not sure if the microstutter applies to games or if it's just a KDE compositor issue (since a fullscreen application should bypass the compositor). If that's the case, I'm almost positive you could run a game with Vsync on but leave it off in compositor settings for desktop use, and get the best of both worlds.
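
On that note, I believe KWin can let fullscreen applications suspend the compositor automatically; the checkbox is "Allow applications to block compositing" in the compositor options, which I think maps to the kwinrc key below (worth verifying on your Plasma version):

```
# Let fullscreen applications bypass/suspend the compositor automatically
kwriteconfig5 --file kwinrc --group Compositing --key WindowsBlockCompositing true
```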

I don't play games on Linux either; I'm going to end up setting up VFIO later down the line, but for right now Animal Crossing is all I need. Thank you for the information, though. Any info I can learn before going for the install is needed info.

So tbh, why are you moving your windows around at 144 Hz?

Selectively game at that resolution and frequency; otherwise stick to 60 for normal viewing. Almost no content is optimized beyond 60, save gaming.

Wayland also doesn't handle it well yet. I recommend you run X for now.

KF5 and GNOME 3 both run on Wayland but can be configured for X.
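
You can check which one your current session is actually using with:

```
# Prints "x11" or "wayland" for the current session
echo $XDG_SESSION_TYPE
```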

I honestly don't know much about monitors, so I just assumed you were stuck at whatever Hz it is made for. I don't plan on gaming in Linux (rather going the VFIO route), so would sticking to 60 on my Arch side be better?


You can always run a lesser clock; that's fine. It's going past that which may not go well, or it may not display at all.

See, the clock of the signal isn't the only thing. A 60 Hz signal is always sampled at a rate of 120 Hz because that's the Nyquist rate, so a 144 Hz signal is technically sampled at 288 while being updated every 144th of a second… this prevents aliasing. Now, the key here being that if you have a 60 Hz display, the DSP would have to handle double whatever higher setting you send it. Most monitors don't have that kind of capability, so when you send a higher signal it often says "display signal not found or resolution not supported".

If you set it to something lower, it has no issue. A 144 Hz panel's DSP can sample at 120 because it's less than 288, and you will see no issue setting it to 60 Hz, save maybe a bit of an alignment or scaling issue as a possibility.

Bottom line / TLDR: anything lower is fine but a waste of the display… anything higher than the advertised frequency usually doesn't work.
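
On X you can see exactly which modes and rates the monitor advertises, and drop to 60 Hz to test, with xrandr. (The output name DP-0 and the 2560x1440 mode below are just placeholders; run plain xrandr first to see your own.)

```
# List connected outputs and the mode/refresh combinations they advertise
xrandr

# Example: drop a hypothetical DP-0 output to 60 Hz for testing
xrandr --output DP-0 --mode 2560x1440 --rate 60

# And back up to 144 Hz
xrandr --output DP-0 --mode 2560x1440 --rate 144
```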

Also, while KDE is nice and so is GNOME… they are more buggy than MATE and XFCE, IMHO. If it ain't broke don't fix it, but the latter two are rock-solid stable and can be made beautiful.

It's mostly compositing that messes up KDE on Nvidia. Turn compositing fully off. You'll have tearing, but the experience will be more consistent.
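
If you want that to stick across logins rather than toggling it each session, I believe the corresponding kwinrc switch is the one below (again, double-check the key name on your Plasma version):

```
# Don't enable compositing at startup
kwriteconfig5 --file kwinrc --group Compositing --key Enabled false

# Restart KWin to apply immediately
kwin_x11 --replace &
```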

Not sure what you mean here…the Nyquist sampling theorem has to do with sampling a continuous-time signal at a certain frequency. For video, a still frame doesn’t need to be “sampled” twice in order to display, just flipped into the scan-out buffer during vsync (or as soon as it’s ready, if vsync is off). It’s true that 144-Hz monitors support lower refresh rates, but unless a monitor has freesync/gsync, there’s a minimum refresh rate that the monitor will accept; that doesn’t match with the Nyquist theorem either.

I know better than to argue with you about signals or RF. :laughing: But video is a real-time scheduling problem, more than anything else.

Also, as for why anyone would want to drag windows around at 144 Hz… there's no Nyquist rate for human visual perception of motion. Your eyes don't put an LPF on the rods and cones, then sample at 120 Hz. Check out Blur Busters if you're curious about this stuff. The TLDR is that objects move continuously, but sample-and-hold displays (i.e. LCDs) produce discrete "steps" of motion. If you try to track a moving object on an LCD, this "stepping" manifests as error (the object is ahead of or behind where it "should" be), resulting in blur on moving objects. Doubling the refresh rate cuts the mean blur in half: an object moving at 960 px/s smears across about 16 px per frame at 60 Hz, but only about 6.7 px per frame at 144 Hz.

It’s cool stuff, but a little off-topic for this thread. The point is, once you try a high-refresh monitor…you really can’t go back to 60 Hz!

Yeah, you are correct here, but it applies to digital clocks as well.

Oh yes they do… LOL. The optical/retinal system of the eye acts as a low-pass filter removing all frequencies above ~60 cycles deg^−1, so it follows from the sampling theorem that the retinal image can be fully represented by sampling at a frequency of 120 cycles deg^−1, which is approximately that of the receptor mosaic. :stuck_out_tongue: We are limited in how much motion we can see. (Source: Digital Image Processing course.) I suppose if you want to get really technical it's actually the brain that does it, but yes, there is an LPF in our eyes :wink:

Oh yeah, 100 percent agree, but you are forgetting… there is something called the discrete Nyquist rate :wink:

What I was referring to is that we sample the clock of the signal and use it to control almost every other signal in the monitor. That's why you can accept lower but not higher, more often than not. It was indeed an oversimplification and off topic for the thread, but I agree you can't go back to 60.

The issue being, if he is running Linux and it's simply not working right, there's no issue in going down to 60 for a bit. No harm, no foul; you don't need above 60 for text and code LOL…

Yes… yes it does, and it's beautiful… but for stationary objects the blur is already so small it doesn't matter. So if he isn't using Linux for gaming or content creation and instead is using it for code… well, LOL… why does it matter what refresh rate it's at?

Anyways, @washthemhands87, if you can't get KDE+Wayland to do it… try KDE+X, and if both can't… just run a different DE.


You still have to sample at twice the rate if you want to avoid aliasing. The thing is that 60 Hz is way too slow to confuse the eyes into thinking it is dynamic, so one frame coming too soon and one frame coming too late will create an inconsistency that triggers effects in your brain, and stuff you can easily see, like tearing.

This is why they started to make 100 Hz CRTs back in the day, and all dogs loved their new TVs :slight_smile:

For those who remember - sorry for triggering your PTSD with barking all the time :smiley:

And even if you want to empathize with the dogs - go to your local PC store and try the Valve Index head tracking on a high-refresh-rate display. Turns out having a fast monitor with slow tracking is not a good idea :smiley:

I had an issue with KDE and the compositor activated: it drops to 60 Hz. But when I turn the compositor off, 144 Hz and G-Sync work.