[SOLVED] Freesync/G-Sync compatible on Nvidia disengages with multiple GPUs in Ubuntu 20.04.2 and KDE

Update: It was an Xorg AutoAddGPU and AutoBindGPU issue. You have to set both options to "false" and manually specify the PCI BusID of the display GPU to ensure no phantom X screens are spawned. This is apparently required on the newer versions but not on older ones.
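For reference, a minimal sketch of how the decimal BusID value can be derived. lspci prints PCI addresses in hexadecimal (e.g. "65:00.0"), while xorg.conf's BusID wants decimal "PCI:bus:device:function"; the address below is an example value, not necessarily yours.

```shell
# Find the display GPU's address first, e.g.:  lspci | grep -i nvidia
# lspci prints hex addresses like "65:00.0"; xorg.conf wants decimal.
addr="65:00.0"                      # example hex address from lspci
bus="${addr%%:*}"                   # "65"
rest="${addr#*:}"                   # "00.0"
dev="${rest%%.*}"                   # "00"
fn="${rest#*.}"                     # "0"
printf 'BusID "PCI:%d:%d:%d"\n' "0x$bus" "0x$dev" "0x$fn"
# -> BusID "PCI:101:0:0"
```

Hex 0x65 is decimal 101, which is why the BusID line in the config below reads "PCI:101:0:0".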


Relatively new bug I found with this specific setup… Something in Ubuntu 20.04.2 (newer X server, newer KDE), interacting with recent revisions of the Nvidia driver, causes it to completely drop Freesync (G-Sync Compatible) support when more than one GPU is installed.

Formerly, on 18.04.5, enabling additional X screens on “connected displays” and reverting back did not affect Freesync. Now in 20.04.2, with 2 GPUs installed, attempting to enable a second display causes a “phantom” X screen number 256 to appear. (Good luck googling for anyone else with this issue; I believe it has never been reported before.)

Because of the phantom X screen 256, the nvidia-settings app freaks out and can no longer display the “enable G-sync on compatible displays” option under the OpenGL menu.

Is this Nvidia’s fault in nvidia-settings, or KDE’s fault for reserving phantom screens and never letting them go? I would love for more people to test this on 20.04.2, because this is pretty broken: an inactive second GPU should still allow Freesync/G-Sync to work.

It is certainly NOT the display manager; I tried both lightdm and sddm with the same result. Wiping ~/.local/share/kscreen had no effect either. The second GPU isn’t even listed in xorg.conf, and this setup was fine on 18.04.5.

For context, I’m using a 1080 Ti, plus a 1660 Ti in the same machine for improved NVENC. NVENC works fine with no X screens attached, as it’s a compute-based workload. It is not a cross-generational issue: the 440.48.02 driver on 18.04.5 allows Freesync with a 1060 and a 1660 Ti installed, since the older stack doesn’t create a phantom X screen.

For picture proof (screenshots):

18.04.5, KDE 5.12.9, Nvidia 440.48.02

20.04.2, KDE 5.18.5, Nvidia 455.50.10
Both use the same xorg.conf:

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 418.52.05

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "ServerFlags"
    Option         "MaxClients" "512"
EndSection

Section "Files"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    Option         "Coolbits" "28"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
        Option     "metamodes" "nvidia-auto-select +0+0 {AllowGSYNCCompatible=On}"
    EndSubSection
EndSection

But this extra bit was present in the Xorg logs on 20.04.2:

Virtual screen size determined to be 640 x 480

…with nothing connected to the secondary GPU.
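A quick way to check for this symptom is to grep the Xorg log for that message. The snippet below uses a sample line so it is self-contained; on a real system, point grep at /var/log/Xorg.0.log (or ~/.local/share/xorg/Xorg.0.log for rootless X).

```shell
# Grep for the phantom-screen symptom; a sample log line is used here
# for illustration so the snippet runs anywhere.
sample_log='Virtual screen size determined to be 640 x 480'
if printf '%s\n' "$sample_log" | grep -q 'Virtual screen size determined'; then
  echo "phantom screen likely present"
fi
# -> phantom screen likely present
```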

I’ve had nothing but trouble getting Freesync working on my AMD/LG setup on Linux. It works just fine on Windows.

I think Freesync might not be fully there yet.

Single-GPU Freesync works great on Nvidia and KDE. Multi-GPU is where this problem creeps in with the newer versions.

Guess what: it was an Xorg AutoAddGPU/AutoBindGPU issue.

Updated xorg.conf specifically for the 20.04.2 system:

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 418.52.05

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "AutoAddGPU" "false"
    Option         "AutoBindGPU" "false"
EndSection

Section "ServerFlags"
    Option         "MaxClients" "512"
EndSection

Section "Files"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    BusID          "PCI:101:0:0"
    VendorName     "NVIDIA Corporation"
    Option         "Coolbits" "28"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
        Option     "metamodes" "nvidia-auto-select +0+0 {AllowGSYNCCompatible=On}"
    EndSubSection
EndSection

I can’t get Gsync to work with more than one GPU connected.
I’ve tried what you said above with no success.
If I connect one GPU, Gsync just works without any configuration.
If I connect more than one, it doesn’t.

My setup:
1 Monitor Gsync Compatible
1 3070 Ti - For Gaming - Connected to the monitor
2 3060 Ti - For Mining

# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 520.56.06

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "AUS VG27AQL1A"
    HorizSync       249.0 - 249.0
    VertRefresh     48.0 - 170.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "NVIDIA GeForce RTX 3070 Ti"
    Option         "Coolbits" "31"
    BusID          "PCI:10:0:0"
    Option         "PrimaryGPU" "yes"
EndSection

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    Option         "Coolbits" "31"
    BusID          "PCI:6:0:0"
EndSection

Section "Device"
    Identifier     "Device2"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    Option         "Coolbits" "31"
    BusID          "PCI:7:0:0"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "DFP-3"
    Option         "metamodes" "2560x1440_170 +0+0"
    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

You must keep the GPUs you’re using for mining out of X11 by making sure they don’t generate any X screens.

Thanks for answering. But how do I disable it?

Turn off auto add GPU and auto bind GPU in your xorg.conf.

I already did that and it still doesn’t work. This is the updated xorg.conf:

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0"
    Option         "AutoAddGPU" "false"
    Option         "AutoBindGPU" "false"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    BusID          "PCI:10:0:0"
    VendorName     "NVIDIA Corporation"
    Option         "Coolbits" "31"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

The last thing I can think of is whether you’re running XWayland or have compositing turned on. You have to run KWin on X11 with all compositing off.
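A quick way to verify the session type, assuming a desktop setup that exports XDG_SESSION_TYPE (most systemd-based distros do):

```shell
# Prints "x11" or "wayland" on typical setups. Compositing state has to be
# checked separately (on Plasma 5, Alt+Shift+F12 toggles the compositor).
echo "Session type: ${XDG_SESSION_TYPE:-unknown}"
```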

I’m on Manjaro KDE. I didn’t change anything. I’ll check those later and let you know. Thanks for trying to help me.

Compositor is disabled and I’m not running Wayland.

Downgrade your drivers. It could be that this broke with a recent driver update meant to fix black-screen issues on the 30-series cards.

Downgraded to 470 drivers and no success.

Looks like X screens are being created on the 3060 Tis… Could that be the issue?

Whatever should prevent those GPUs from creating X screens seems to be broken. I tested on Ubuntu, so things may be different on Manjaro. There may be additional Xorg config files, for example in xorg.conf.d, that are interfering.
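To hunt for such fragments, these are the usual drop-in locations Xorg reads in addition to xorg.conf (the paths are the standard defaults; not all of them will exist on every install):

```shell
# List Xorg drop-in config fragments that could be re-adding the mining
# GPUs to X; missing directories are silently skipped.
ls -l /etc/X11/xorg.conf.d/ /usr/share/X11/xorg.conf.d/ 2>/dev/null || true
```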

I just wanted to say thanks for posting this. I’m on KDE Plasma 5.24.7 and Ubuntu 22.04, and my system has two GPUs. The second GPU had an unnecessary Xorg instance running on it, and I was unable to turn on the visual indicator for G-SYNC under the OpenGL settings (though I suspect G-SYNC was still on). After setting AutoAddGPU/AutoBindGPU to false, I can now show the G-SYNC indicator, verify G-SYNC is working properly, and confirm the second GPU has no Xorg process running on it. So, still relevant, thanks!
