Hi, I’m currently thinking about buying an AMD Radeon Pro WX 3200 as a simple display GPU for all my monitors (3x 165 Hz WQHD), plus a second GPU (NVIDIA or AMD) that will handle rendering, or GPU passthrough in combination with Looking Glass feeding the first GPU (so I don’t have to switch inputs every time).
(1) Is something like this even possible?
(2) Is this a good host card?
(3) There are multiple entry-class Quadros for around $200; are there better options for my case?
(4) Is the host GPU even able to drive 3x WQHD at this high a refresh rate?
(5) How can you render on one GPU and display on a different GPU on Linux (no virtualisation)?
(6) Wayland is better for this, right?
1: Yes, with caveats. Your guest GPU will need either EDID adapters or a connection to your monitors. If you game on this, connect it to your monitors.
2: If it supports what you want in the specifications, then yes. I would suggest grabbing it used off eBay, much cheaper. Same for the Quadros: a Quadro that retailed around $250, like the K620, runs about $30 on eBay.
3: You want a host GPU that can handle the number of displays and the refresh rates you want. That’s all that matters. Avoid cards that are too low-end, as they can’t drive the displays well.
4: The correct choice of host GPU will work fine. Note that Looking Glass has overhead above 1920x1080. It is noticeable at 4K resolutions, costing about 10-15 FPS in my configuration.
5: No idea here. I know it can be done, but I haven’t bothered. I was looking at it when I was still using a Tesla to test this stuff out.
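For reference, the usual mechanism for this on Linux is PRIME render offload: the secondary GPU renders and the frames are copied to the GPU driving the displays. A minimal sketch, assuming Mesa drivers (and the NVIDIA-specific variables for the proprietary driver); provider names and IDs vary per system:

```shell
# List the available render/display providers (output varies per system)
xrandr --listproviders

# Mesa drivers: render on the secondary GPU, display on the primary
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# NVIDIA proprietary driver: render offload uses its own variables
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears
```

These commands query real hardware, so the exact output depends on your GPUs and drivers; they are a starting point, not a guaranteed recipe.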
6: No idea. Never use Wayland.
Looking Glass also has some input lag since it uses Spice to grab keystrokes/mouse input. It works well, but direct USB control of the guest VM is a much nicer experience. I have my setup so I can use Looking Glass, or directly connect my keyboard/mouse/4K screen via a KVM switch. Note that I only pass one screen via Looking Glass. I’m unsure if multi-display Looking Glass is supported yet.
You mean something that will act like a video capture card? I try to make the hardware work with the software alone. No need for more cables, adapters and other stuff… If I start down that road, I end up buying everything twice (in my opinion).
Thanks for the advice. I’ll keep an eye out.
I tried to ask this somewhere else too, but got no answer. Some people claimed that resolution and refresh rate go hand in hand on every card [4K @ 60 Hz, 1440p @ 120 Hz and Full HD @ 240 Hz], but they couldn’t explain exactly why with the math.
Highest pixel output on one device at 60 Hz: 7680 x 4320 x 60 = 1,990,656,000 pixels/s
My monitor Configuration:
Resolution: WQHD (2560x1440)
Refresh Rate: 165Hz
3 x (2560 x 1440) x 165 = 1,824,768,000
Which is less than the 1,990,656,000 pixels/s maximum.
Does that mean I always have this bandwidth available from this card as a maximum, and can trade refresh rate against resolution to my liking? In other words, if the card’s spec sheet doesn’t list a higher refresh rate at a lower resolution, is it still possible?
For WQHD it should be even better for me, I guess.
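The naive pixel-rate comparison above can be checked directly in a shell. Note this is only the raw pixel count; real link limits also include blanking intervals and encoding overhead, so actual bandwidth ceilings are stricter than this:

```shell
# Naive pixel throughput: width x height x refresh rate, summed over displays.
# Real cable/connector limits also include blanking and encoding overhead.
max=$((7680 * 4320 * 60))        # spec-sheet maximum pixel rate at 60 Hz
mine=$((3 * 2560 * 1440 * 165))  # 3x WQHD at 165 Hz
echo "max:  $max"                # 1990656000
echo "mine: $mine"               # 1824768000
if [ "$mine" -le "$max" ]; then echo "fits (naively)"; fi
```

As noted further down the thread, passing this naive check does not guarantee a card supports the mode; always verify against the manufacturer’s specifications.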
I still hope Steam will deliver a flawless new version of SteamOS and Proton so that maybe I don’t have to do this at all.
No. Windows (running on the VM) will not render anything on the passthrough GPU unless the GPU output is connected to something that looks like a monitor to the GPU. This means that the GPU must have something plugged into it. If you have a monitor with a spare input you can hook it up to the GPU (most monitors will work fine even if that input is not selected). You can also get a dummy plug and plug it into the GPU.
You can’t just do math to figure this out. The specifications are provided by the manufacturers. Don’t guess at stuff, verify. I have a Tesla K80 sitting on a shelf collecting dust because of guessing…
Please don’t repeat this nonsense. Spice isn’t used to grab anything, nor does it introduce appreciable latency. Early versions of LG have had issues with processing input but B3 introduced raw input, and B4 fixed a ton of bugs here. Next release (B5) fixes a ton of seamless integration issues.
A lot of effort has gone into making sure the code path for Spice input is as short and efficient as possible. After factoring in the USB stack needed for USB passthrough in the guest, LG’s Spice implementation is on par with passthrough devices, and in some instances actually has lower latency than a passthrough USB hub & mouse.
Looking Glass uses Spice. Spice has some input lag vs USB passthrough. I know this because I use it every single day. Some games handle the Spice input fine, some do not, and you notice little jitters in your inputs. For regular applications it’s fine; it’s just specific titles, and it may not even have anything to do with LG or Spice.
Anything as complicated as virtualizing an entire box and then trying to establish low latency performance on a generic Kernel is going to take time to perfect.
The mouse issues mentioned in the FAQ are resolvable, that’s the point of the information there. Nowhere does it say it’s to be expected or normal.
What version of LG are you using?
Are you on Wayland or Xorg?
Did you add and install the drivers for virtio-mouse?
Did you turn on input:rawMouse if you’re running a recent enough version of LG?
Do you capture the mouse when you use these more sensitive games?
If you have latency with Spice then you have something misconfigured, or you’re using a version of LG that still has some flaws that were fixed just before the B4 release, with more fixes pending the B5 release.
Little jitters are not latency; honestly it sounds like a scaling/rounding bug if you’re not running a 1:1 resolution with your guest (capture mode, by the way, bypasses this scaling).
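For anyone following along, the input:rawMouse option from the checklist above can be passed straight on the client command line; LG client options use the module:name=value form. The config-file path is an assumption and varies by version, so check your build’s --help:

```shell
# Enable raw mouse input on the Looking Glass client (B3 or later)
looking-glass-client input:rawMouse=yes

# Roughly equivalent in the client config file (path varies by version),
# e.g. ~/.looking-glass-client.ini:
#   [input]
#   rawMouse=yes
```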
The 5th section is about disabling spice… If you are referring to the sections on mouse jumpy/lag, as I mentioned earlier:
This is not normal behaviour, if you’re having these issues then it’s limited to you or your setup and you should not be telling people that “Spice has input lag”, which is the point I am trying to make.
If you join us on the LG Discord we can try to help you resolve this if you care to.
Why do I need Spice if I don’t want a Spice display device?
You don’t need Display Spice enabled. Looking Glass has a Spice client built in to provide some conveniences, but you can disable it with the “-s” argument.
Without Spice, Looking Glass cannot send mouse/keyboard input to the guest and clipboard synchronization is disabled.
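So, per the FAQ quoted above, disabling the built-in Spice client is just the “-s” argument:

```shell
# Run Looking Glass with its built-in Spice client disabled.
# Keyboard/mouse input and clipboard sync must then reach the guest
# some other way (e.g. a passed-through USB controller).
looking-glass-client -s
```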
Spice is used to control the VM. It is a client/server system and will introduce latency that you will not see with a direct VFIO USB controller like I have. I am not using USB device passthrough; I have a PCI USB host controller passed through to the VM.
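Passing a whole PCI USB host controller like that is done with vfio-pci. A minimal sketch; the PCI address 0000:03:00.0 is a placeholder for whatever lspci reports on your machine, and the controller must be in its own IOMMU group and bound to vfio-pci first:

```shell
# Identify the USB controller's PCI address (output varies per system)
lspci -nn | grep -i usb

# Hand the whole controller to the guest
# (placeholder address; add this to your usual QEMU options)
qemu-system-x86_64 -device vfio-pci,host=0000:03:00.0
```

With libvirt, the equivalent is a hostdev entry in the domain XML rather than a raw QEMU flag.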
That is his point. Spice is not a bottleneck anymore out of the box unless you are using an older version before the fix was introduced. If you are still having issues, then there is something wrong with the configuration of the system. It could also be a corner case, but ultimately, I would not argue with the guy who made and maintains the program.
I think this point is still valid.
FooLKiller is claiming that Spice is an issue and introduces latency. gnif is stating that Spice should not be an issue anymore, not that it is not being used. He asks that people not spread false information. They are really talking past each other, and it has turned into an off-topic discussion anyway. OP’s post has been answered. The rest of this should probably be taken offline if there is a legitimate issue with current LG.