The 60 FPS reading comes from the -k option; I can't pin down why it can't get over 60.
What desktop environment? Using a compositor? Can other applications run at over 60 FPS?
You could try a different DE.
Arch / i3wm / compton. The host is running at 165 Hz; the difference is clearly visible.
Inside the guest, game FPS goes over 60.
It's only the guest -> LG client path: the client shows 60 FPS with 60 UPS inside games, never above, and the system is powerful enough to go beyond that, so it can't be the bottleneck (i7 8700K, 1080 Ti, and 8 GB for the guest).
I'm missing something somewhere.
Are you using DXGI? Have you tried disabling compton temporarily? Does the GPU have the full x16, or at least x8, bandwidth of PCIe 3.0? What resolution are you running? Have you tried a different game too?
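A quick way to check two of these (a sketch; the PCI address `01:00.0` is a placeholder, take yours from `lspci | grep VGA`):

```shell
# Temporarily stop the compositor; restart it later with `compton -b`
pkill compton

# Show the GPU's negotiated PCIe link speed/width
# (LnkCap: what the slot supports, LnkSta: what it's actually running at)
sudo lspci -vv -s 01:00.0 | grep -E 'LnkCap:|LnkSta:'
```

If `LnkSta` reports a lower width than `LnkCap`, the card is not getting its full link.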
You can always try Teeworlds as a test game (it's open-source, cross-platform, and can display very high FPS).
Is anyone able to push [email protected]? For me on Alpha 10, I can't even get 60 FPS on the desktop, let alone in games.
@Celmor Yes, I'm using DXGI; I disabled compton and it's not the culprit; the GPU is at x8 and the resolution is 1440p. Shouldn't looking at the desktop give max FPS on the LG FPS counter?
I suppose [email protected] with uncompressed frames would saturate the bandwidth quite a bit, and if you're gaming on top of that, it'll fight the game engine for bandwidth. From what I've seen, @gnif is working on an H.264 encoder that should leverage the GPU's built-in one, which should improve things quite a bit in bandwidth-limited situations.
Good to hear. Thanks for the info. My host and guest GPUs are both running at x8. I wonder whether running at x16 would remedy the issue. Unfortunately, my CPU doesn't have enough lanes to test…
That's not how this works. The FPS is the locally displayed FPS, i.e. in the client on the host, so it shouldn't matter what is displayed. The difference between showing the desktop and showing gameplay is that the UPS should change: updates don't need to be sent as often for a (relatively) static image as for a constantly changing one.
I have a 165 Hz monitor; last time I plugged it via DP into the host and via HDMI into the guest's GPU (which doesn't allow above 60 Hz, though I overclocked it to 165 Hz). The client displayed ~163 FPS and sometimes above 60 UPS, though I suppose I'm also bandwidth/CPU/latency limited there.
I understand the difference between UPS and FPS, and I can push UPS = FPS.
We have about the same setup, yet my client never shows above 60 FPS. It's annoying, as I've gotten used to a smooth mouse.
But I'm playing around with glxgears right now. Something is fishy; I'll keep you all posted. Somewhere in this system something vsyncs to 60 Hz, but it's not the LG client.
Thanks for your help, @Celmor.
Edit: found it. I had one 60 Hz monitor on the host GPU, and it dragged everything down to 60 FPS vsync inside the shell. Now I have to figure out whether I can disable that.
For the best results you need to pass an entire sound card through to your VM, as QEMU's emulated one is less than stellar. I hear something is in the works, but yeah.
UAC was the cause of this
Yeah, I already know that the emulated sound is glitchy, but I don't have a dedicated sound card on hand right now. I was just confused that starting Looking Glass triggered it.
It might be a bandwidth or CPU latency issue that triggered it; sound quickly gets worse under CPU load. You might be able to isolate QEMU's emulation threads on their own cores, away from the worker threads on the host, via pinning to mitigate that.
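With libvirt, that pinning could be sketched in the domain XML roughly like this (core numbers are an example for a 6-core host; adjust to your own topology):

```xml
<vcpu placement='static'>4</vcpu>
<cputune>
  <!-- each guest vCPU on its own dedicated host core -->
  <vcpupin vcpu='0' cpuset='2'/>
  <vcpupin vcpu='1' cpuset='3'/>
  <vcpupin vcpu='2' cpuset='4'/>
  <vcpupin vcpu='3' cpuset='5'/>
  <!-- QEMU's emulator/IO threads kept off the vCPU cores -->
  <emulatorpin cpuset='0-1'/>
</cputune>
```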
You can try using the branch of QEMU that is patched with a better PA driver.
Thanks, I'll give it a try when I have time.
Hi, I'm having an issue building the package on Fedora 27 (upgraded from 26).
I'm seeing an error about the spice/protocol.h header:
```
Package libva was not found in the pkg-config search path.
Perhaps you should add the directory containing `libva.pc'
to the PKG_CONFIG_PATH environment variable
Package 'libva', required by 'virtual:world', not found
Package 'libva-glx', required by 'virtual:world', not found
gcc -c -g -O3 -std=gnu99 -march=native -Wall -Werror -I./ -I../common -DDEBUG -DATOMIC_LOCKING -ffast-math -fdata-sections -ffunction-sections -Wfatal-errors -DBUILD_VERSION='"a10-34-ga02087e5e4"' -o .build/main.o main.c
Package libva was not found in the pkg-config search path.
Perhaps you should add the directory containing `libva.pc'
to the PKG_CONFIG_PATH environment variable
Package 'libva', required by 'virtual:world', not found
Package 'libva-glx', required by 'virtual:world', not found
gcc -c -g -O3 -std=gnu99 -march=native -Wall -Werror -I./ -I../common -DDEBUG -DATOMIC_LOCKING -ffast-math -fdata-sections -ffunction-sections -Wfatal-errors -DBUILD_VERSION='"a10-34-ga02087e5e4"' -o .build/lg-renderer.o lg-renderer.c
Package libva was not found in the pkg-config search path.
Perhaps you should add the directory containing `libva.pc'
to the PKG_CONFIG_PATH environment variable
Package 'libva', required by 'virtual:world', not found
Package 'libva-glx', required by 'virtual:world', not found
gcc -c -g -O3 -std=gnu99 -march=native -Wall -Werror -I./ -I../common -DDEBUG -DATOMIC_LOCKING -ffast-math -fdata-sections -ffunction-sections -Wfatal-errors -DBUILD_VERSION='"a10-34-ga02087e5e4"' -o .build/spice/spice.o spice/spice.c
spice/spice.c:40:10: fatal error: spice/protocol.h: No such file or directory
 #include <spice/protocol.h>
          ^~~~~~~~~~~~~~~~~~
compilation terminated.
make: *** [Makefile:33: .build/spice/spice.o] Error 1
```
As previously suggested, I've used the following list to install the prerequisites, as the one listed in the guide has a few errors at the moment:

```
sudo dnf install git SDL2-devel SDL2_ttf-devel openssl-devel spice-protocol fontconfig-devel libX11-devel gnu-free-mono-fonts ivshmem-tools libgle-devel.x86_64
```
Any help would be much appreciated. I can see this being either a revision mismatch or a Fedora change that's causing the issue.
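For what it's worth, the `libva.pc` errors in that log mean pkg-config can't find the VA-API development files; on Fedora those would presumably come from the `libva-devel` package (package name assumed, not confirmed in this thread):

```shell
# Install the VA-API development files that provide libva.pc / libva-glx.pc
sudo dnf install libva-devel

# Confirm pkg-config can now resolve the modules
pkg-config --exists libva && echo "libva found"
```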
I may have just resolved this myself by switching to the a10 tag. I'll keep you posted.
I am a total noob when it comes to linux and virtualization, but after stumbling across this amazing project I had to give it a try. Over the last 2 days I have managed to install Arch using the Plasma desktop and have Looking Glass running with a Windows 10 VM. Everything seems to be running very well. Can’t say that I understand everything that I did, but I sure learned a lot on the way.
I have a couple of quick questions. The output of Looking Glass shows the following:
It says "Using decoder: NULL". Should I be using a decoder, and if so, what would I need to install for that? Also, I'm not using Spice, since I'm plugging the passed-through graphics card into an extra input on my monitor. Is there any performance to be gained or lost by using Spice instead?
As a note if anyone has problems with the compositor in KDE causing issues, unchecking “Allow applications to block compositing” will fix it.
That should mean it's transferring frames uncompressed/unencoded, so you should be getting the lowest latency and best quality as long as you have the bandwidth (RAM bandwidth, and PCIe bandwidth for the GPU). The encoder (e.g. H.264) is still a work in progress and should help people who only have limited bandwidth available.
Spice isn't used for video, only for input, i.e. passing the key presses and mouse movements the Looking Glass client receives through to the guest OS.
Currently the GPU needs something plugged in regardless: either a monitor (in your case you just switch the monitor's input, since the host is connected to the same monitor) or a dummy plug; otherwise LG wouldn't work.
Whether you want to use Spice depends on how you want to control the guest OS. You can disable it if you assign USB peripherals to the VM and control it more directly, in which case things like macro buttons, RGB, DPI, and so on can be controlled by software in your VM.
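For reference, assigning a USB keyboard or mouse directly to the VM can be done with a libvirt hostdev entry along these lines (the vendor/product IDs below are placeholders; take yours from `lsusb`):

```xml
<hostdev mode='subsystem' type='usb' managed='yes'>
  <source>
    <vendor id='0x046d'/>   <!-- example vendor ID from lsusb -->
    <product id='0xc07d'/>  <!-- example product ID -->
  </source>
</hostdev>
```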
The PS/2 driver is still a bit buggy, so you may experience a few issues like stutter or position mismatch, but gnif has been trying to fix that. I personally prefer using a KVM switch that plugs my mouse and keyboard into the host and into a passed-through USB controller, and I switch inputs with a physical button.
PS: Nice to see that LG has inspired new people to do a VGA passthrough setup.