Looking Glass - Triage

Did you change it from SPICE to VNC on the VM?

Actually, I don't have any SPICE server on the VM (and never had).
Hum…

Then disable SPICE in the client (-s); clearly you have something listening on port 5900 that it's trying to connect to.

It did the trick!

I actually run the default Vino VNC server on Gnome, which indeed is listening on port 5900.
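A quick way to confirm what is bound to port 5900 (just a sketch using standard Linux tools, nothing LG-specific):

$ ss -tlnp | grep 5900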

Reading the help menu on LG, it says the (-s) option is to 'enable the built in SPICE client for input and/or clipboard support'

Yeah, and what does the value column say? It's an option to be toggled; the default is on.
Sorry, that was a bit rude :slight_smile: let me rephrase it.

The option value column states what the option is currently set to. For SPICE the default is enabled, and -s disables it. Alternatively you could use spice:enabled=false
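For reference, a minimal sketch of both forms mentioned above, assuming the client binary is installed as looking-glass-client (either line disables SPICE):

$ looking-glass-client -s
$ looking-glass-client spice:enabled=false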

Do this at your own risk,

but I've enabled the experimental SPICE multiclient support to prevent interference from virt-manager, which closes Looking Glass whenever I change the machine via virt-manager while it's running. So far no issues, except when I have a qemu video interface enabled at the same time (which I normally don't, except for emergencies).

https://www.spice-space.org/multiple-clients.html

libvirt:

<qemu:commandline>
  <qemu:env name='SPICE_DEBUG_ALLOW_MC' value='1'/>
</qemu:commandline>
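Note: libvirt only accepts the qemu:commandline element when the QEMU XML namespace is declared on the domain element; a sketch of the opening tag, assuming a KVM domain:

<domain type='kvm' xmlns:qemu='http://libvirt.org/schemas/domain/qemu/1.0'>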

or
shell/script:

$ SPICE_DEBUG_ALLOW_MC=1 qemu-system-x86_64 arguments

These issues are fixed as of 3d426cce; you will need to upgrade both the client and the host to see this resolved.

Tested 5e201a32 yesterday because the asynchronous stuff sounded interesting.
Sadly I still see the heavy loss in FPS as mentioned previously.

Not sure if this was to be expected, but thought I'd keep you updated anyway.

That said, in your post you explicitly mention "Non-windowed full screen capture is now fast".
Is this now preferred over borderless window (which is what I was testing; I forgot to try explicit fullscreen)?

The DXGI capture change has made a very big difference from what I can tell. I was actually using the scanline flushing with RTSS that another member posted about (I forgot their name and would have to search for the thread) because it made a difference for me at high resolution in perceived smoothness and displayed UPS. You mentioned that it added a little bit of latency, but not enough that I noticed, and the difference it made for playability was noticeable. In testing a couple of games with the latest beta, it either no longer mattered whether it was on, like in WoW, which doesn't get consistent FPS anyway but showed apparent stutters before the latest beta without it and was much smoother with it enabled; or it actually made things worse, like in The Witcher 3, where I went from 50 FPS to 40 FPS. So now I've left it disabled, since it seems to no longer be needed.

The Talos Principle still plays better with it on unless I limit the FPS to around 80 or so, but I never saw 100% GPU usage either way. I mentioned this game in the other thread too because of its benchmark. With it disabled the benchmark will average around 97 FPS, but I can see the stutters, and with the FPS display on, the UPS is on the lower end; with it enabled the benchmark will average around 80 FPS and it's much smoother, with much higher UPS, closer to or matching the framerate of the game. This was the only exception in my limited testing: other games were either smooth enough that I couldn't tell the difference with it on or off, or weren't maxing out my GPU. Previously those games either would have some stuttering or would fully load the GPU, so the improvement is definitely there.

EDIT: It also didn't really make a difference to the FPS of the game(s) or the UPS of the client whether the game was full screen or borderless. WoW doesn't even give me the option for full screen; I think they removed it either with the DX12 renderer or removed it entirely. I didn't try with the DX11 renderer (I'd have to check), but the DX12 renderer is better anyway, with a ~15-20 FPS improvement.

Please update the host to B1-rc5; there were still some minor issues that had to be resolved. However, I believe your performance issues are not so much LG-related, but a limitation of your hardware and/or configuration.

There is no longer any preference; use whatever works best for you.

Actually it shouldn't add any latency unless there is a hiccup in performance due to other system tasks, etc., and then only for a frame or two.

Correct, in earlier versions this was an issue which is now resolved.

Depends on your definition of limitation.
Yes, these games (FFXV and Skyrim) are pushing my hardware, but again, Parsec does not cause any FPS loss.

As you've mentioned before, FPS loss is to be expected given how LG does not buffer and (previously) had to stall the GPU.

Now that the stalling issue has been resolved (awesome work btw.) I was hoping to see some gain, but oh well…

I will for sure test B1-RC5 (and all other upcoming builds for that matter) :slight_smile:

According to https://ko-fi.com/post/Massive-DXGI-Performance-Boost-W7W2W6YN

Does it mean the following?

When running games in fullscreen and windowed, the performance is now the same?

Does DXGI have the same performance (speed of capturing frames) as NvFBC, or is it in some situations even better than NvFBC? It is interesting that DXGI can be faster than NvFBC :open_mouth: I don't understand why :smiley:

Will there be any benchmark comparison between DXGI before and after the performance update? :slight_smile:

From my testing, yes.

Because the DXGI capture interface provides both cursor shape updates AND cursor position information in each update. NvFBC does not; it only provides cursor shape, so we have to rely on a system-wide hook to obtain cursor position information, which is slower.

Perhaps; I am looking to release a video to coincide with the B1 release that will cover all the changes since A12, etc.


That member would've been me. I can vouch for the performance improvement of B1-rc5 vs A12, to the point where RTSS flushing is almost no longer required. There are very few games where I have UPS performance drops, and I've pinpointed the issue to the GPU when it's being utilized at 100%. This only happens when I look at certain geometry, and it happens so exceedingly rarely that I have to go out of my way to find certain sections to trigger it.

This is the only time where RTSS scanline flushing would help, and even then, there are some sections where 100% utilization cannot be avoided; all it can do is reduce the instances where it happens by giving some GPU headroom, since it reduces performance.
Even so, it's not really a problem of Looking Glass, but rather the guest's graphics hardware, so an upgrade would solve the issue, which is now something I have to do for that consistent 120 FPS.

I'm also glad that B1's CPU overhead is so much lighter that it's almost indistinguishable from not running it at all. Compared to A12, which reduced CPU performance by almost 5% in CPU benches and spiked LatencyMon consistently, this is incredibly welcome.


Are there still performance issues when using an integrated GPU on the host? I'm thrilled to see if I can get playable frame rates at 1440p with the beta RC; I just need to know if I should pick up a cheap discrete GPU for the host on my way home.

Thanks for chiming in! Yes, I agree it's almost completely unnecessary now, but it definitely still depends on the game. I did some more testing last night and re-enabled it for WoW, because there are times where the GPU is at 100% and I can see it stuttering (it's not a consistent-FPS game for various reasons). By contrast, when I was playing The Witcher 3 at a solid 50 FPS at 100% GPU usage and the UPS basically matched, I couldn't tell it had any issues at all.

Not much of an upgrade path with a 1080 Ti, at least not one that's worth it. :stuck_out_tongue: If I cap some of the games at a slightly lower FPS than 100, like around 80, or turn down some details, then sure, I can get a consistent experience without RTSS flushing. But as you said, it's not the fault of LG; the hardware is doing as much as it can, so it just depends on the game. Overall a big improvement though!

I don't have any iGPUs to test with so I can't comment on any improvements here. I can however state that if your FPS (not UPS) is low on the host, then your host GPU is the cause.


When looking-glass-client is running (from A12 to B1-rc*), the monitor of the host PC can't go to sleep (using stock Ubuntu with GNOME).

Aborting the LG client process then results in normal screen power management behavior, e.g. a blank screen after 15 min.

Any ideas?

I just pushed in a fix for this.

Once published on the downloads page, I'll give it a try, thanks.