
Just a little update on LG progress since B4

Development has continued on LG at an astonishing rate in the 18 days since the B4 release. With new features such as ImGui integration, and the awesome damage-tracking work by Quantum and xyene, the next release is on track to be nothing short of amazing!

We have added a new feature called jitRender (Just-In-Time Render) which, while currently considered highly experimental, is extremely exciting! jitRender is able to completely eliminate any latency caused by the GPU driver buffering frames, even with vsync on, giving the best of both worlds… zero tearing with zero additional latency.

Here is a little sync demo under X11 with vsync on! Note that once the jitRender calibration we have to perform under X11 is complete (you will see this as the little spike in the top timing graph), we are consistently 1-2 frames faster than the physical GPU output at getting the frame to screen.

Recorded at 100FPS, slowed to 30FPS. Resolution is [email protected], NvFBC capture in the guest, egl:vsync=on, VM to VM without DMABUF.

Last night I implemented the new configuration and informational dialog, squashed a ton of bugs with the new overlay mode, and wrote the infrastructure needed to easily add additional overlays/widgets.


The next release will be an exciting one!


Something new…



After (Sharpness @ 1.0):

Next on the agenda… AMD FSR


Is AMD FSR being used on LG's Linux client to reduce the amount of computation on the client? Or to improve the experience of Linux clients using low-performance graphics cards?

Man, this looks good.

Personally, a tiny touch so the UI matches Unity Engine's default dialog and button designs would give it the genuine polish it deserves. It's not that hard, actually; Unity's default UI boxes are easy to re-create.

For example, pressing F1 in Subnautica. That UI should be super easy to model the overlay's buttons and menus on.

No, it's to improve image quality on old titles that only render at low resolution; FSR is a general upscaling algorithm and does a magnificent job at this. We also have some users already using FSR because they run the guest at a lower resolution for streaming to Twitch, and then upscale in the client to fill their screen. There are some very, very cool uses of FSR (see below).

For those that are bandwidth-constrained due to using an iGPU, it should help too, as they can now run their guest at a lower resolution and upscale it to fill the screen (note: performance on an iGPU has not yet been verified).

Comparisons with LG’s FSR:

BF5 @ 1024x768 → 1600x1200 (First image has FSR off)

CS:GO @ 1920×1080 → 2560×1440 (First image has FSR off)

Utawarerumono @ 1280x720 → 2560x1440 (First Image has FSR off)

StarCraft @ 800x600 → 1600x1200 (First Image has FSR off)


We are working on it; right now the focus is on core implementation and integration, later we can worry about styles, etc.


Currently I like the OOTB i3 scheme. I'm happy with it :wink:

But I like blocks and hard corners, so…

I got my 5700G installed yesterday, so I'll hopefully be back on that LG game on the weekend when I have time to set up the VM.



It seems that the effect is really good. This is a bit like supersampling. If I use 1080p on the VM, then after enabling FSR, the LG on the HOST can also get an improvement in image quality, right?

Firstly let me address the nomenclature:

The LG Host is the application that runs inside the VM
The LG Client is the application that connects to the host

Think server/client. This distinction is important as LG can operate in VM to VM mode in which case both the client and the host applications are running inside VMs.

To answer your question: only if it's actually up-scaling. If you need it to upscale, then yes, you will get an improvement.

Since some titles upscale to the desktop resolution before output, such as remastered games like StarCraft 1, FSR won't help you unless you can reverse this upscaling. For this, today I added another filter, a downscaler…


768x480 → 1200p with CAS+FSR:



It sounds like the remaster of StarCraft uses an upscaling algorithm similar to FSR, so FSR is ineffective for it?

Is this a game you are developing? Or one you are playing?

Not at all. StarCraft is literally just pixel doubling, no intelligent scaling at work at all, thus the output is quite terrible.

That’s StarCraft by Blizzard mate


Hahaha, I've never played StarCraft, so I derped.

Bro, this is the Looking Glass thread. It’s the frame relay program for passthrough vms.


Lol. LG = Looking Glass. I guess I learn something each and every day.


From the current point of view, the pixel-style RTS game StarCraft really looks like an indie game. XD