Looking Glass - Triage

Thanks, that helps. I just remembered that I have a G-Sync monitor. Now I need to look into what happens with that and an AMD card that won’t hit its 144 Hz refresh rate. I suddenly realize what all those iOS people mean when they talk about “being locked into the ecosystem” :wink:

Edit: While we’re on that topic, does G-Sync work with GPU passthrough? From my understanding it should, just not with Looking Glass, correct?

FreeSync and G-Sync do not work with LG. IIRC, the only way to get them working is to pass a GPU through to the guest and connect the monitor directly to that guest GPU.

TBH I’m not an expert on this stuff. You should check out the one stop shop thread.


Thanks, I’ll check that, but logically it makes sense.

I am trying to set up Looking Glass. After following the official instructions in the website’s quickstart guide, this happens when I attempt to start my VM:

error: internal error: qemu unexpectedly closed the monitor: 2018-09-01T05:38:06.979529Z qemu-system-x86_64: -object memory-backend-file,id=shmmem-shmem0,mem-path=/dev/shm/looking-glass,size=33554432,share=yes: can't open backing store /dev/shm/looking-glass for guest RAM: Permission denied

I have tried my google-fu for the past few days and can find no solution.

@mathew2214

It is suggested that you create the shared memory file before starting the VM, with the appropriate permissions for your system. This only needs to be done once at boot time, for example (this is a sample script only; do not use it without altering it for your requirements):

touch /dev/shm/looking-glass           # create the shared memory file
chown user:kvm /dev/shm/looking-glass  # "user" is a placeholder for the account running the client
chmod 660 /dev/shm/looking-glass       # read/write for the owner and the kvm group
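If your distro uses systemd, a tmpfiles.d entry can recreate the file with the right ownership on every boot. A minimal sketch, assuming systemd-tmpfiles and again using user as a placeholder:

# /etc/tmpfiles.d/10-looking-glass.conf
# Type  Path                    Mode  Owner  Group  Age
f       /dev/shm/looking-glass  0660  user   kvm    -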

What does your ls -ld /dev/shm/looking-glass tell you?

And how are you starting qemu? Presumably as yourself?

@mathew2214

Have you read this guide?

I’ve bound a hardware mouse to my VM, but the input lag is really noticeable (and annoying). When I switch to the native output from my graphics card there is no such lag, so there’s quite some overhead in Looking Glass itself. Can I expect that to improve in the future?

I gave NvFBC a shot because I consider my setup barely usable. I know it’s not supported, but curiously enough the input lag is even worse with it, unless I start a game, where it becomes pretty acceptable(?)

-rw-rw---- 1 root root 0 Sep 1 12:10 /dev/shm/looking-glass

I am running qemu as root.

Thank you, that fixed it right up. Now the VM starts, but Windows 10 gives an IRQL_NOT_LESS_OR_EQUAL error. I guess that’s a question for a different thread and not related to Looking Glass.

Do not, I repeat, DO NOT disclose which card you used it on. As per NVIDIA’s license, Geoff can get in HUGE trouble if NvFBC is used with the “wrong” card. It could result in lawsuits against the project.

Input lag is unrelated to capture mode. Remove the Tablet pointing device from your VM; LG doesn’t support absolute pointing devices.
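If the VM is managed by libvirt, a sketch of what to remove (the domain name is a placeholder, and your bus attribute may differ):

virsh edit YourVM
# then delete the absolute pointing device, which typically looks like:
#   <input type='tablet' bus='usb'/>
# a relative device such as <input type='mouse' bus='ps2'/> can stay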

For consumer cards it’s impossible to do with his code anyway :slightly_smiling_face:

As for the input lag, I’m not using the tablet pointing device, or in fact any emulated devices. As I’ve said, I’m passing my HID hardware right through to the VM, not even using Spice, so LG shouldn’t really interact with the mouse at all. I suspect the input lag is purely related to the delay in writing and reading the framebuffer.

I’m not sure if it applies to DXGI as well, but as per the Capture SDK forum, some 80 ms+ of input lag is supposedly to be expected due to triple buffering, which Windows 10’s DWM pretty much enforces now with no way to turn it off. So I’m wondering if it’s possible to (make LG) create a fake fullscreen app that allows interaction with the desktop while forcing vsync off, IF that is in fact the culprit for the input lag.

Curious: for the host video card, do I need the same amount of VRAM as the NVIDIA guest card?

No, the host card doesn’t do any 3D rendering and only needs enough VRAM to display “pictures” at the desired resolution.
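For a rough sense of scale, a back-of-the-envelope sketch assuming 1920×1080 at 32 bits per pixel (not an official sizing formula):

1920 × 1080 × 4 bytes ≈ 8.3 MB per frame
8.3 MB × 2 frames ≈ 16.6 MB, plus a little overhead for cursor data
rounded up to the next power of two → 32 MiB

which lines up with the size=33554432 (32 MiB) shared memory object in the error message earlier in the thread.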

So it seems the fix is easier than I would have imagined. Just passing -o opengl:vsync=0 removed nearly all input lag.
I pretty much have the perfect VM now. So happy. :slightly_smiling_face:

Edit: I guess I cheered too early. While input lag is barely noticeable on the desktop, game performance is significantly crippled by disabling vsync in LG for some reason. FPS tends to be around 100 (200 as reported by LG) and UPS around 45 at most.

And another edit: Setting -K 60 fixes the crippled performance. Yay.
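So the full invocation ends up something like this (a sketch combining the two options above; -F for fullscreen is optional):

looking-glass-client -F -o opengl:vsync=0 -K 60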

That’s awesome. I just bought an RX 560 to try this out. Thanks for the quick reply!

DXGI Desktop Duplication is not part of the old Windows capture API; the GPU driver itself handles the capture directly and provides the captured texture to Windows to hand off to the application. As such there is no triple buffering, and in some instances we actually get the new frame before it is even sent to the physical screen.

FPS is accurate at 200; there is a hard limiter to prevent it exceeding 200 FPS for those who run without vsync, and -K changes this hard limit. If lowering the limit increases your UPS, the host CPU was starving for cycles running at the higher rate.

Your problem was more likely due to a compositor on Linux. Lowering the hard FPS limit to your actual refresh rate essentially introduces an artificial vsync without the benefits of vsync. It would be better to find out why vsync is introducing lag.
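If you want to rule the compositor out, you can suspend it for testing; a sketch assuming KDE’s KWin (other desktops have their own toggles):

qdbus org.kde.KWin /Compositor suspend
# and to turn it back on:
qdbus org.kde.KWin /Compositor resume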

Turning the compositor off entirely doesn’t seem to help either, I’m afraid. I’m unsure what else the reason could be; glxgears, for example, runs just fine, and glxinfo looks as it should.

So I have had Looking Glass working amazingly well for a few weeks now. This morning I awoke to a failure. After reinstalling most of the virtualization software as well as Looking Glass, I still cannot get the looking-glass-client to hook into Spice.

looking-glass-client -F
[I]               main.c:692  | run                            | Looking Glass (a69630764b)
[I]               main.c:693  | run                            | Locking Method: Atomic
[I]               main.c:686  | try_renderer                   | Using Renderer: OpenGL
[I]               main.c:775  | run                            | Using: OpenGL
[I]              spice.c:159  | spice_connect                  | Remote: 127.0.0.1:5900
[E]              spice.c:757  | spice_read                     | incomplete write
[E]              spice.c:591  | spice_connect_channel          | failed to read SpiceLinkHeader
[E]              spice.c:167  | spice_connect                  | connect main channel failed
[E]               main.c:868  | run                            | Failed to connect to spice server
[dustin@dustin-pc ~]$ 

The Windows looking-glass server is running without issue, and the GPU is still passed through. The Linux client is the only thing not working currently.
Any ideas would be very helpful.

Spice hasn’t been working for me either as of late, and I’m not sure which package I updated that might be to blame. Try the evdev input method as a workaround perhaps.
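For reference, evdev passthrough is typically wired up with QEMU’s input-linux objects; a minimal sketch, where the /dev/input/by-id paths are placeholders for your own devices:

-object input-linux,id=mouse1,evdev=/dev/input/by-id/YOUR-MOUSE-event-mouse
-object input-linux,id=kbd1,evdev=/dev/input/by-id/YOUR-KEYBOARD-event-kbd,grab_all=on,repeat=on

With grab_all=on, pressing both Ctrl keys toggles the grabbed devices between host and guest.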