[UPDATE] [FIX USING MODIFIED EDID] Really weird issue with Freesync/G-Sync Compatible LFC on Linux

SEE POST 7. You need to modify your EDID.


Recently I reinstalled my 1080 Ti with a 120mm AIO (this is not the issue) and on my BenQ EX2510, Low Framerate Compensation appeared to be working. The screen didn’t blank out on low frame rates and all was fine…

That was until I started to play with resolution settings in the latest version of Stardew Valley.

Setting 1920x1080 fullscreen actually made the monitor drop to 1280x1024. Setting it to 720p fixed it, and switching back to 1080p worked again after that. Unfortunately, Stardew refuses to save the fullscreen setting on my GOG copy for Linux and always reverts to Borderless Windowed.

Then it got weirder. I realized I had to re-apply a fix to xorg.conf so that only a single X screen was present. Even then, when I set the game to 720p, the monitor ran at 720p but the X screen stayed at 1080p.
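For anyone hitting the same two-X-screen problem: the xorg.conf fix I keep re-applying boils down to declaring only one Device/Screen pair so the second GPU doesn't get its own X screen. A rough sketch (identifiers and the BusID are placeholders for my setup; yours will differ, check `lspci`):

```
Section "ServerLayout"
    Identifier "Layout0"
    Screen     0 "Screen0"
EndSection

Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"   # placeholder: the bus ID of the GPU you want driving X
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
EndSection
```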

Even worse, this somehow turned OFF LFC on that entire GPU.

I double checked and switched the enabled GPU from the 1080 Ti to the 1660 Ti, and the 1660 Ti had no issues whatsoever.

I deleted the KScreen settings and switched from LightDM back to SDDM; the 1080 Ti still refuses to work with LFC.

Is this an architecture issue (Pascal vs Turing) or an XDG/RandR/X11 issue? (Changing resolutions in Stardew is what triggered it.)

Is it possible I was not using Freesync at all due to having two X screens when I initially reinstalled the 1080 Ti, and that masked its issues since the monitor was being sent a fixed refresh rate?

The biggest thing I would like to learn from this is: did Nvidia reserve LFC for Turing and above in the Linux drivers? That would suck for 1080 Ti owners. The Windows driver does driver-level LFC, but apparently that didn't come until later for the Linux driver (if at all; I have my doubts).

TL;DR: Did I do anything wrong here? Is there a more widely documented case where LFC doesn’t work on Pascal but does on Turing?

Really? Nobody can confirm this with a 20-series vs 10-series A/B test? I was seriously wondering why my 1660 Ti was “better” at FreeSync than my 1080 Ti.

On the 1080 Ti, if the frame rate rapidly dipped below 48 fps (like when a game temporarily hung), the screen blanked out (but, interestingly, not if the frame rate was consistently below 48 fps). On the 1660 Ti, anything below 48 fps was properly displayed, no matter what.
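For anyone unfamiliar, LFC is supposed to handle exactly this case: when the frame rate drops below the monitor's VRR minimum, the driver repeats each frame so the effective refresh rate stays inside the range. A minimal sketch of the idea (assuming a 48-144 Hz range, which matches the 48 fps threshold I'm seeing; the real driver logic is obviously more involved):

```python
def lfc_refresh(fps, vrr_min=48.0, vrr_max=144.0):
    """Pick a frame multiplier so the effective refresh rate
    lands inside the monitor's VRR range."""
    if fps >= vrr_min:
        return fps, 1  # already inside the range: no compensation needed
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1  # repeat each frame one more time
    effective = fps * multiplier
    if effective > vrr_max:
        effective = vrr_max  # range too narrow for a clean multiple; clamp
    return effective, multiplier

# 30 fps: each frame shown twice, effective 60 Hz
print(lfc_refresh(30))   # (60, 2)
# 20 fps: each frame shown three times, effective 60 Hz
print(lfc_refresh(20))   # (60, 3)
```

If that multiplier logic ever picks a bad multiple during a frametime spike, the monitor gets timings outside its range, which would explain a blank-out on a sensitive panel.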

This would mean that LFC isn't really LFC in the Nvidia Linux drivers on Pascal.

Test all APIs as well, OpenGL, Vulkan, OpenGL ES…

Sorry, I don’t have access to the hardware to test with. :confused:

Okay, I’ve been able to consistently reproduce the LFC screen-blanking issue with the BenQ EX2510 on my 1080 Ti, but not on my 1660 Ti, on Linux. Windows has absolutely no problem with this monitor, since driver-level LFC kicks in. On Linux, you’d be lucky to get driver-level LFC.

It ALWAYS happens when a frametime spike occurs in just the right way: the LFC algorithm on Pascal freaks out and spits out garbage timings. (Only on Linux, mind you, and on the latest 460 driver too.)

In the same conditions, the 1660 Ti never encounters this issue.

This is the same result on both DP ports of my 1080 Ti STRIX, so it is not a GPU power issue like when you push an RX 580 to 220 W. (That would be understandable, because the silicon behaves weirdly at those power levels.) The GPU was at its stock clocks, temps were under control with the 120mm liquid cooler, and at identical locations in games I consistently got the same screen blanking on the 1080 Ti while the 1660 Ti didn't.

Something about this isn't right. It's like they abandoned VESA Adaptive-Sync LFC development on Pascal but continued it on Turing and up. And the screen-blanking issue only applies to the Linux drivers.

DO NOT UPDATE TO STABLE DRIVER 470.74.

Apparently this completely breaks G-Sync Compatible.

In the thread, someone with a 1050 Ti reports the same issue I see when things are otherwise working, leading me to believe recent drivers made LFC changes that screw up timings on Pascal GPUs… and 470.74 just broke timings for all GPUs.

Reddit thread:

https://www.reddit.com/r/linux_gaming/comments/ps8us2/nvidia_47074_breaks_gsync/

From the thread, it only affects FreeSync monitors, not ones with native G-Sync modules.

WOW. Apparently this bug is not new. It’s been around for a while.

It didn’t affect my VX3211-4K-MHD because it doesn’t blank the screen during this incorrect LFC behavior. The EX2510 is sensitive to this behavior and has this problem.

This is an impossible-to-fix problem. (Those stuck on Pascal will have to upgrade or use VFIO.)

So, if you’re on Pascal, you now have these things working against you:

  • Poor VKD3D performance
  • Unoptimized AMD FSR performance (Pascal purposely cripples FP16 performance)
  • Broken Freesync LFC
  • No DP DSC

Worst of all, to get an AMD card, you give up NVENC and NVFBC.

I am so mad right now.


Fixed using a different FreeSync range. (Not a driver-level fix, but a patch to the monitor EDID.)

The explanation from the Blur Busters founder himself: Firmware Fixes To Fix for VRR Blank Out Issues (Monitors Randomly Going Black For 2 Seconds) - Blur Busters Forums
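For reference, the EDID patch amounts to rewriting the advertised vertical rate range and fixing the block checksum. A sketch of the idea in Python, assuming (as on many FreeSync-over-DP monitors) the range lives in the base block's Display Range Limits descriptor (tag 0xFD); the byte offsets follow the VESA EDID layout, but verify your own dump with `edid-decode` before loading anything:

```python
def patch_vrr_min(edid: bytes, new_min_hz: int) -> bytes:
    """Rewrite the minimum vertical rate in the EDID's Display Range
    Limits descriptor (tag 0xFD) and fix the base-block checksum.
    Sketch only: assumes the VRR range is advertised in the base block."""
    block = bytearray(edid[:128])
    # Four 18-byte display descriptors live at offsets 54, 72, 90, 108.
    for off in range(54, 126, 18):
        if block[off:off + 4] == b"\x00\x00\x00\xfd":
            block[off + 5] = new_min_hz  # byte 5 of the descriptor: min vertical rate (Hz)
            break
    else:
        raise ValueError("no range-limits descriptor found")
    # All 128 bytes of an EDID block must sum to 0 mod 256.
    block[127] = (-sum(block[:127])) % 256
    return bytes(block) + edid[128:]
```

You can grab the raw EDID from `/sys/class/drm/*/edid`, patch it, and point the NVIDIA driver at the result with the `CustomEDID` xorg.conf option.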
