Use NVIDIA GPU for gaming VM and host without big performance loss?

Hi guys,

First of all: yes, I tried to search for this topic but didn't find anything specific about it, so I hope you can help me.

Currently I'm running an Arch Linux machine (i3 + X11) with an i7-7700K (using its iGPU for the host) and an NVIDIA GTX 1080 for my Windows gaming VM. The GTX is exclusively reserved for the VM via VFIO. For a long time I was very happy with this setup because it works without any problems.
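For context, the static reservation is just a modprobe rule that binds the card to vfio-pci at boot. A minimal sketch of what I have (the PCI IDs 10de:1b80 / 10de:10f0 are the usual GTX 1080 GPU and HDMI audio IDs, but verify yours with `lspci -nn`):

```
# /etc/modprobe.d/vfio.conf — reserve the GTX 1080 (GPU + HDMI audio) for the VM
# IDs below are typical for a GTX 1080; verify with: lspci -nn | grep -i nvidia
options vfio-pci ids=10de:1b80,10de:10f0
# make sure vfio-pci claims the card before the nvidia/nouveau driver can
softdep nvidia pre: vfio-pci
softdep nouveau pre: vfio-pci
```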

But now there are more and more native Linux games and I want to support the Linux gaming community :wink: So I started searching for a way to use the GTX 1080 for both the host and the VM. My main goal: zero (or as low as possible) performance loss.

I found many setups that demonstrate dynamic binding of a GPU, e.g. https://arseniyshestakov.com/2016/03/31/how-to-pass-gpu-to-vm-and-back-without-x-restart/, and primus/bumblebee is often used for the GPU switching on the host. But during my research I often found hints about performance loss with primus/bumblebee, because the way the frames are transferred doesn't seem optimal.
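As far as I understand, the dynamic binding from that post boils down to detaching the card from vfio-pci and handing it to the nvidia driver via sysfs (and reversing that before starting the VM). A rough, untested sketch of what I mean, with the PCI address as a placeholder:

```bash
#!/bin/bash
# Sketch of runtime driver switching, based on the linked post (untested).
# 0000:01:00.0 is a placeholder — find your card with: lspci -D | grep -i vga
GPU=0000:01:00.0

# Give the card to the host's nvidia driver (shut the VM down first!)
echo nvidia > "/sys/bus/pci/devices/$GPU/driver_override"
echo "$GPU" > /sys/bus/pci/drivers/vfio-pci/unbind
echo "$GPU" > /sys/bus/pci/drivers/nvidia/bind

# To hand it back to the VM later, reverse the steps:
# echo vfio-pci > "/sys/bus/pci/devices/$GPU/driver_override"
# echo "$GPU" > /sys/bus/pci/drivers/nvidia/unbind
# echo "$GPU" > /sys/bus/pci/drivers/vfio-pci/bind
```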

I know about Looking Glass, but I found a user on YouTube who uses LG and reports noticeable input lag with it. So, in its current state, it doesn't seem perfect.

So I'm asking you for help. Is there any way to achieve my goal of lossless sharing? Thank you :slight_smile:

It’s possible, but I’ve never managed to pull it off.

The issue is that if you want no performance loss, you're going to need to restart X to set your 1080 as the primary output and disable the iGPU.

Bumblebee/PRIME should be enough for most use cases, though.
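If you go the restart route, the switch is basically swapping which Device section X uses and then restarting it. A rough sketch, assuming BusID placeholders you'd translate from your own `lspci` output:

```
# /etc/X11/xorg.conf.d/10-gpu.conf — swap the active section, then restart X
# Variant A: host runs on the iGPU, 1080 stays free for the VM
Section "Device"
    Identifier "iGPU"
    Driver     "modesetting"
    BusID      "PCI:0:2:0"
EndSection

# Variant B: host runs on the GTX 1080 (comment out Variant A first)
#Section "Device"
#    Identifier "GTX1080"
#    Driver     "nvidia"
#    BusID      "PCI:1:0:0"
#EndSection
```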

Looking Glass itself has no display lag; desktop compositors do. I've seen frames arriving on the host system before they are rendered on the guest. The problem is that GNOME adds about 100 ms of latency to everything because Mutter needs a lot of work. KDE isn't much better either.

As for input lag, it's minimal, maybe 10 ms.

The other thing is that the Looking Glass client isn't built for Linux.


At the moment, I don’t think you can achieve lossless GPU sharing without an X restart or a full system reboot.

On the consumer side of NVIDIA and AMD, SR-IOV (which is what you need) is not available (certain legal reasons prevent it, even though the hardware is capable). So for now a reboot is needed in order for you to use the 1080 on the host (the simplest way, after you remove the stub/blacklisting you applied to the card).

Thank you for your feedback. I think I will try the X restart approach. If I only need to change the Xorg configuration and restart X to switch the GPU, that would be tolerable.