SEE POST 7. You need to modify your EDID.
Recently I reinstalled my 1080 Ti with a 120mm AIO (the AIO is not the issue), and on my BenQ EX2510 Low Framerate Compensation (LFC) appeared to be working: the screen didn’t blank out at low frame rates and all was fine…
That was until I started to play with resolution settings in the latest version of Stardew Valley.
Setting 1920x1080 fullscreen actually switched the monitor to 1280x1024. Switching to 720p fixed it, and switching back to 1080p worked again after that. Unfortunately Stardew refuses to save the fullscreen setting in my GOG copy for Linux, always reverting to borderless windowed.
Then it got weirder. I realized I had to re-apply a fix to xorg.conf so that only a single X screen was present, and after that, switching to 720p gave me a monitor resolution of 720p but an X screen resolution of 1080p.
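For context, the single-X-screen fix is along these lines. This is not my exact file, just the general shape: the identifiers, BusID, and mode name are placeholders, and the AllowGSYNCCompatible metamode token is the documented way to request VRR on a G-SYNC Compatible (FreeSync) display with the Nvidia X driver:

```
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"
    # Placeholder bus ID for the active GPU; get the real one from lspci
    BusID      "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    # Ask the driver for VRR on a G-SYNC Compatible (FreeSync) display.
    # The mode name is a placeholder for whatever 1080p mode the EX2510 exposes.
    Option     "metamodes" "1920x1080_144 +0+0 {AllowGSYNCCompatible=On}"
EndSection
```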
Even worse, this resolution mismatch somehow turned OFF LFC on that entire GPU.
To double-check, I switched the active GPU from the 1080 Ti to the 1660 Ti, and the 1660 Ti had no issues whatsoever.
I deleted the KScreen settings and switched from LightDM back to SDDM, but the 1080 Ti still refuses to work with LFC.
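For anyone who wants to check the same thing on their end, this is roughly how I’d verify whether VRR is actually engaged. Treat it as a sketch; attribute names can differ between driver versions (check `nvidia-settings -q all` for what your driver exposes):

```
# List the X screens and GPUs the driver currently exposes
nvidia-settings -q screens
nvidia-settings -q gpus

# Show the metamode in use; if the VRR request was accepted,
# the AllowGSYNCCompatible token should show up here
nvidia-settings -q CurrentMetaMode

# Turn on the on-screen indicator while a game is running
# (the attribute name has changed between driver versions)
nvidia-settings -a ShowGSYNCVisualIndicator=1
```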
Is this an architecture issue (Pascal vs. Turing) or an XDG/RandR/X11 issue (since changing resolutions in Stardew is what triggered it)?
Is it possible I wasn’t using FreeSync at all because two X screens were present when I initially reinstalled the 1080 Ti, and that this masked the problem since the monitor was being driven at a fixed refresh rate?
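A rough way to check for a stray second X screen, if anyone wants to rule that out on a similar setup:

```
# How many screens does the X display actually have?
xdpyinfo | grep -E "number of screens|default screen number"

# Monitors attached to the first X screen
xrandr --listmonitors

# If a second X screen exists this lists its monitors;
# otherwise it should fail to open the display
DISPLAY=:0.1 xrandr --listmonitors
```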
The biggest thing I would like to learn from this: did Nvidia reserve LFC for Turing and newer in the Linux driver? That would suck for 1080 Ti owners. The Windows driver does driver-level LFC, but apparently that came to the Linux driver later, if at all (I’m starting to have my doubts).
TL;DR: Did I do anything wrong here? Is there a more widely documented case where LFC doesn’t work on Pascal but does on Turing?