Development on LG has continued at an astonishing rate in the 18 days since the B4 release. With new features such as ImGui integration, and the awesome damage-tracking work by Quantum and xyene, the next release is on track to be nothing short of amazing!
We have added a new feature called jitRender (Just-In-Time Render) which, while currently considered highly experimental, is extremely exciting! jitRender is able to completely eliminate the latency caused by the GPU driver buffering frames, even with vsync on, giving the best of both worlds… zero tearing with zero additional latency.
Here is a little sync demo under X11 with vsync on! Note that once the jitRender calibration we have to do under X11 is complete (visible as the little spike in the top timing graph), we are consistently 1-2 frames faster than the physical GPU output at getting the frame to screen.
Recorded at 100FPS, slowed to 30FPS. Resolution is 1200p@60Hz, NvFBC capture in the guest, egl:vsync=on, VM to VM without DMABUF.
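The core idea behind a just-in-time renderer like the one described above can be sketched in a few lines: instead of rendering as soon as a frame is available, the client sleeps until just before the next vsync deadline, leaving only enough time to render plus a safety margin. This is a minimal illustration of the scheduling math only; the function name, the fixed render-time estimate, and the margin are all hypothetical and not LG's actual implementation.

```python
def jit_delay(now, next_vsync, render_estimate, margin=0.0005):
    """Return how long (in seconds) to sleep so rendering starts as
    late as possible while still finishing before the vsync deadline.

    The margin guards against underestimating the render time and
    missing the deadline, which would cost a whole refresh interval.
    """
    latest_start = next_vsync - render_estimate - margin
    return max(0.0, latest_start - now)

# With a 60 Hz display (~16.7 ms period) and a 2 ms render estimate,
# a frame arriving at the start of the interval waits ~14 ms before
# being rendered, absorbing the driver's buffering latency.
delay = jit_delay(now=0.0, next_vsync=1 / 60, render_estimate=0.002)
```

In practice the render-time estimate would be measured continuously, which is presumably what the calibration spike in the timing graph corresponds to.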
Last night I implemented the new configuration and informational dialog, squashed a ton of bugs with the new overlay mode, and wrote the infrastructure needed to easily add additional overlays/widgets.
Is AMD FSR on LG’s Linux client meant to reduce the amount of computation the client has to do? To improve the experience of Linux clients using low-performance graphics cards?
Personally, a tiny touch so the UI matches Unity Engine’s default dialog and button designs would give it the genuine polish it deserves. It’s not that hard, actually; Unity’s default UI boxes are easy to re-create.
For example, pressing F1 in Subnautica. That should be super easy to model the overlay’s buttons and menus on.
No, it’s to improve image quality on old titles that only render at low resolution; FSR is a general upscaling algorithm and does a magnificent job at this. We also have some users already using FSR because they run the guest at a lower resolution for streaming to Twitch, and then upscale in the client to fill their screen. There are some very, very cool uses of FSR (see below).
For those that are bandwidth-constrained due to using an iGPU, it should help too, as they can now run their guest at a lower resolution and upscale it to fill the screen (note: performance on an iGPU has not yet been verified).
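To put rough numbers on the bandwidth point: the per-second cost of copying uncompressed 32-bit frames scales with resolution, so dropping from 1080p to 720p roughly halves the traffic, and FSR fills the gap back to native on the client side. This is plain back-of-the-envelope arithmetic, not measured LG figures.

```python
def frame_bandwidth_mb(width, height, fps, bytes_per_pixel=4):
    """Approximate memory traffic in MB/s for copying uncompressed
    frames (e.g. BGRA, 4 bytes per pixel) at a given frame rate."""
    return width * height * bytes_per_pixel * fps / 1e6

native  = frame_bandwidth_mb(1920, 1080, 60)  # ~497.7 MB/s at 1080p60
reduced = frame_bandwidth_mb(1280, 720, 60)   # ~221.2 MB/s at 720p60
```

Halving both dimensions would quarter the traffic, which is why a lower guest resolution plus client-side upscaling is attractive when the copy path is the bottleneck.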
Comparisons with LG’s FSR:
BF5 @ 1024x768 → 1600x1200 (First image has FSR off)
It seems the effect is really good. This is a bit like supersampling. If I run the VM at 1080p, after enabling FSR the LG on the HOST side can also get an improvement in image quality, right?
The LG Host is the application that runs inside the VM
The LG Client is the application that connects to the host
Think server/client. This distinction is important as LG can operate in VM to VM mode in which case both the client and the host applications are running inside VMs.
To answer your question: only if it’s actually upscaling. If you need it to upscale, then yes, you will get an improvement.
Since some titles upscale to the desktop resolution before output, such as remastered games like StarCraft 1, FSR won’t help you unless you can reverse this upscaling. For this, today I added another filter: a downscaler…
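For titles that do a simple integer nearest-neighbour upscale internally, reversing it conceptually just means dropping the duplicated pixels before FSR performs a proper upscale. This toy sketch (pure Python over a list-of-rows image, purely illustrative and nothing like LG's actual shader) shows the idea:

```python
def downscale_nearest(pixels, factor):
    """Reverse an integer nearest-neighbour upscale by keeping every
    factor-th pixel in each row and every factor-th row.

    pixels is a list of rows, each row a list of pixel values.
    """
    return [row[::factor] for row in pixels[::factor]]

# A 2x2 image upscaled 2x to 4x4 via pixel duplication...
upscaled = [
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [3, 3, 4, 4],
    [3, 3, 4, 4],
]
# ...recovers the original 2x2 image exactly.
assert downscale_nearest(upscaled, 2) == [[1, 2], [3, 4]]
```

If the title used a filtered (bilinear) upscale instead, the duplication is not exact and this lossless trick no longer applies; only the integer nearest-neighbour case can be cleanly reversed.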