It looks really cool. I’d like to switch to Linux in about two months, and I need to capture video from inside a VM and display it on the Linux host machine with as low latency as possible (for gaming). At first I was thinking about buying a capture card (around 200€), but then I found Looking Glass, and it’s exactly what I want (so I’d rather donate something to the Looking Glass devs instead). But I have some questions about Looking Glass.
Can I use Looking glass if I have:
CPU: i5 4670
iGPU: Intel HD 4600 (for host)
GPU: Nvidia GTX 1060 6GB (for guest)
Is there any screen tearing?
What is the latency? In that video it looked like about one frame, but Looking Glass should be able to display the frame sooner, because normally the GPU is waiting on the monitor, right? So I don’t know what the current state is: whether it’s faster, or 1:1. I also read somewhere that having V-Sync on adds about a frame of latency, but is that V-Sync on the host or on the guest?
What is the input lag/latency? As I said, I’d like to use the VM for gaming, so I need the latency to be as low as possible. Or should I rather use Synergy?
How does Looking Glass actually work? I found that it communicates on port 5900, and I wonder why. Does that mean all the data (frames) goes over the “network”? That doesn’t seem very efficient. I thought Looking Glass transferred frames using shared memory, i.e. direct guest-memory to host-memory access, or something like that.
What is the performance hit of using Looking Glass? That is, what is the CPU and RAM usage on the client (displaying) and the host (capturing), and does it use the GPU at all?
What is the current state of development? The latest commit on GitHub is from February.
You should try getting PCIe/graphics passthrough to work before going down the Looking Glass path.
There’s generally no added latency and no performance hit: rendered framebuffers travel from guest to host via shared memory, so it’s just some simple byte copies plus a little protection and synchronization, which takes essentially no time.
Looking Glass grabs the framebuffer directly, so Looking Glass itself doesn’t produce any tearing, but a compositor on Linux could. In fact, it’s recommended to run your games (on the Windows side) without vsync when using Looking Glass, to reduce latency.
I’ve seen frames arrive on the host system before they show up on the guest display.
This is an oversimplification, but basically, it captures frames on the Windows guest, copies them to shared memory between guest and host, and displays them on the host.
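To make the “copies them to shared memory” step concrete, here is a minimal sketch of the idea in Python using `multiprocessing.shared_memory`. This is not Looking Glass’s actual IVSHMEM code (that’s C, with real capture and synchronization); the frame contents and sizes here are purely illustrative:

```python
# Sketch of the guest->host handoff idea (NOT Looking Glass's actual
# IVSHMEM implementation): the "guest" writes a frame into a shared
# memory region, the "host" attaches by name and reads it back out.
from multiprocessing import shared_memory

FRAME_SIZE = 1920 * 1080 * 4  # one 1080p BGRA frame, in bytes

# "Guest" side: copy the rendered frame into shared memory.
shm = shared_memory.SharedMemory(create=True, size=FRAME_SIZE)
frame = b"\xab" * FRAME_SIZE        # stand-in for a captured frame
shm.buf[:FRAME_SIZE] = frame        # the only copy that crosses over

# "Host" side: attach to the same region and read the frame out.
view = shared_memory.SharedMemory(name=shm.name)
received = bytes(view.buf[:FRAME_SIZE])
assert received == frame            # same bytes, no network involved

view.close()
shm.close()
shm.unlink()
```

The point is just that the frame never touches a socket: one process writes bytes, the other reads the same physical memory.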
As for port 5900, that’s the Spice channel that handles mouse and keyboard. Just understand that it’s a fully self-contained implementation.
Expect to dedicate about one core to Looking Glass. The GPU hit is not noticeable for me on an RX 580 or a GTX 1070 Ti.
It’s a simple tool and Geoff has a day job. He can’t dedicate all his time to it, and there are precious few open issues. I would argue the tool is mature.
I’ve not seen much more than 30% utilization between the Windows agent and the Linux “server”, but some people have seen more. I was measuring a worst-case scenario; I probably should have said so.
The shared memory must be RAM unless the GPU supports SR-IOV. That said, Looking Glass uses RAM.
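For reference, the way a RAM-backed shared region like this is usually wired up in QEMU is an ivshmem device backed by a file under `/dev/shm`. The path and 32M size below are just commonly documented example values, not something specific to your setup; check the Looking Glass docs for the size your resolution needs:

```shell
# Illustrative QEMU flags for a RAM-backed ivshmem region
# (example path/size only; size depends on your resolution):
qemu-system-x86_64 \
  -object memory-backend-file,id=ivshmem,share=on,mem-path=/dev/shm/looking-glass,size=32M \
  -device ivshmem-plain,memdev=ivshmem
  # ...plus the rest of your VM definition
```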
Oh, that’s a good question. I think it relies on a feature of Windows 10, but I can’t be 100% sure.
Weeelllll… that depends on how you define support.
On AMD, the B chipsets support it, but only a few boards can actually make use of it by doing some unusual things with the PCIe lanes. The B chipsets don’t support splitting the 16 PCIe lanes meant for graphics; that is X-chipset only.
Well if you had read the second paragraph it would’ve answered this question.
Looking Glass is targeted at extremely low latency use requirements on the local computer … In current testing at a refresh rate of 60Hz it is possible to obtain equal or better than 16 milliseconds of latency with the guest. If the user doesn’t care for VSYNC this can be further reduced to under a few milliseconds on average.
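The 16 ms figure quoted there is just the frame time at 60 Hz, i.e. at most one frame of latency. A quick sanity check (the function name is mine, not from Looking Glass):

```python
# Frame time at a given refresh rate: one frame of latency at 60 Hz
# works out to roughly the 16 ms the project quotes.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(frame_time_ms(60))   # about 16.67 ms per frame
print(frame_time_ms(144))  # about 6.94 ms per frame
```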
Yes and no. Tearing occurs when the next frame is output before the monitor has finished drawing the current one. You would experience tearing in LG in the same cases where you would normally experience it with VSync off, just as if you were playing the game in Windows on bare metal. LG does not prevent or cure tearing; it’s just a copy, so if your system drew a tear, you’ll see that too.
You need VT-x and VT-d, as with a normal hardware passthrough. According to the googling I did on the Intel ARK page, your CPU supports both.
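If you’d rather verify this on the machine itself instead of on ARK, a couple of standard Linux checks (output varies by distro, and VT-d also has to be enabled in the BIOS/UEFI):

```shell
# VT-x shows up as the "vmx" CPU flag (AMD's equivalent is "svm"):
grep -c -E 'vmx|svm' /proc/cpuinfo   # non-zero means the CPU supports it

# VT-d / IOMMU messages appear in the kernel log once enabled at boot
# (e.g. with intel_iommu=on on the kernel command line):
dmesg | grep -i -e DMAR -e IOMMU
```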
I talked with Geoff a long time ago, during the closed beta. What happens is that LG on the host side renders the latest full frame with vsync on; with vsync off (on the host), that’s when you can get tearing.