Looking Glass - Triage

Some DEs treat the window better or worse depending on whether it is bordered or borderless, etc. Have a play around; it is possible to force it to launch in a specific location. I use this to make it span three monitors under i3.
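As an illustration, under i3 a for_window rule can force the client into a specific spot. This is only a sketch: the window class and the 5760x1080 geometry (three 1080p screens side by side) are assumptions, so check yours with xprop first.

for_window [class="looking-glass-client"] floating enable, move position 0 px 0 px, resize set 5760 1080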

I am not running the client with vsync. This is how I’m launching:
looking-glass-client -s -k -F -o opengl:vsync=0

My guest GPU is an Nvidia GTX 1070 Founders Edition.

Based on my research I think I should be able to get slightly better numbers. What’s interesting to me is that the UPS number is so steady: it sits right around 60 and never goes above it, except when running Teeworlds. (All games have vsync disabled, however.)

I am sorry, but it’s been quite some time since I was running A11 now. There have been some changes in the latest master that may resolve this issue for you; I do recall patching something that might have fixed this a few weeks back. You can try the latest version if you like, but for now there is limited to no support for it.

I’ll grab the latest Windows host binary & client source and report back tomorrow. Thanks.

Grabbed looking-glass-host & looking-glass-client from the latest commit:
d235d076c47e404ed4617f2575847dce81419633
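For anyone who wants to test the same revision, something like the following should work (assuming the main GitHub repository; build steps omitted since they vary between versions):

git clone https://github.com/gnif/LookingGlass.git
cd LookingGlass
git checkout d235d076c47e404ed4617f2575847dce81419633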

Still experiencing a cap of 60 UPS in CS:GO, Rocket League, StarCraft 2, etc.

  • Games run in the guest over 100 FPS consistently
  • looking-glass-client FPS is 195-200
  • looking-glass-client -s -k -F -o opengl:vsync=0

In Teeworlds I can pull 500+ UPS. However, in all other games it seems like I can reach 60.9*** but never higher than that. Odd.

Please try the egl renderer…

-g egl

Performance seems about the same, possibly a tiny bit better. Unfortunately the FPS/UPS meter does not appear to work in that mode:

  • looking-glass-client -s -F -k -g egl -o egl:vsync=0

While I see the same behavior as gyblin (no FPS/UPS meter), I am impressed by the EGL renderer. When using OpenGL, top usually shows looking-glass-client at 160% CPU on my 2950X, so basically almost two cores being consumed. This high CPU utilization is present even when not gaming, although it then hovers around 80% CPU (again, as reported by top).

Using EGL it tops out at 20% even when gaming at 2560x1440, and when the system is idle it drops to 5%. I am not sure if this is just the expected behavior of EGL (which does eliminate a lot of unnecessary calls) or if you actually coded it differently, but I am really glad you took the time to add it.
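For anyone wanting to compare these numbers on their own machine, one simple way to watch just the client’s CPU usage (assuming a single looking-glass-client process is running):

top -p "$(pgrep -f looking-glass-client)"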

A few things to try (see the sketch after this list):

  • I can’t remember the flag, but there is one in the client to set the max FPS. I assume you have a 144 Hz monitor, so set it to that.
  • You may be getting locked at 60 by your desktop compositor, especially if you are using GNOME or a DE based on it.
  • Make sure you have your refresh rate set correctly in your display settings in Linux; it often doesn’t like to save.
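A rough sketch of the first and last suggestions, using the flags seen elsewhere in this thread (-K is the frame rate limit flag mentioned in the posts below) plus a quick xrandr check of the active refresh rate:

looking-glass-client -s -k -F -K 144 -o opengl:vsync=0
xrandr --query | grep '\*'   # the mode marked with * is the currently active one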

Check the changelog thread.

A bit of both 🙂 It will become the default renderer in A12.

This may be because of differences in my setup, but I actually find the opposite: the OpenGL renderer will use about 20% CPU while the EGL renderer will use about 30%. EGL also looks noticeably worse in that it isn’t as sharp, and this can be seen in screenshots; that *may* be due to running a slightly odd resolution to fill the space under the GNOME top bar rather than running full size. Considering the resolution it still runs really well, completely playable.

EGL: (screenshot)

OpenGL: (screenshot)

Hmm, I think it may have to do with my -K 144 flag: OpenGL goes insane while EGL works very well. When I run with -K 60, OpenGL behaves better.
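For reference, the two invocations being compared here (same flags as earlier in the thread, only the -K frame rate limit changed):

looking-glass-client -s -k -F -K 144 -o opengl:vsync=0
looking-glass-client -s -k -F -K 60 -o opengl:vsync=0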

I have never noticed the sharpness issue. I will take a look on mine, but I am traveling this week and doing an RMA on my memory as well… it may take a while.

The sharpness difference is potentially due to the new 4:2:0 chroma subsampling support. EGL supports the 4:2:2 and 4:2:0 chroma subsampling typically used in video codecs.

No, it’s because his pixel ratio is not 1:1, so it’s performing mipmapping. EGL handles this differently, as I have not yet properly added this feature back to it.

Sorry, just to clarify: do you mean that trying to run 3440x1440 in a 3440x1411 window is causing that, or do you mean something else? I have set a custom resolution of 3440x1411 to match the client window size, so it should be 1:1 if I understood you correctly.

3440x1440 in a 3440x1411 window

Correct. Anything that is not exactly 1:1 will cause texture mipmapping/scaling, which is why you’re seeing a drop in sharpness; even a one-pixel difference is enough.

Strange, then, since the resolutions match.

How is that the same?

I have set a custom resolution of 3440x1411 to match the client window size. I don’t run 3440x1440 unless I switch to using the monitor for the guest.