Looking Glass - Triage

Newb-ish question. I can’t for the life of me figure out how to install the ivshmem drivers on Windows. I’ve downloaded the zip file provided on that GitHub thread, but what do I do with it?

Edit: Download devcon and run the command
devcon.exe install ivshmem.inf "PCI\VEN_1AF4&DEV_1110&SUBSYS_11001AF4&REV_01"
The .inf file has this command in the comments at the top, but be sure to include the quotation marks; otherwise the Windows command line treats the ampersands as command separators and mangles it.
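
For anyone else stuck at the same point, the full sequence looks roughly like this (a sketch; the paths are placeholders for wherever you extracted devcon and the driver zip, and the .inf sits in the subfolder matching your Windows version and architecture):

    REM From an elevated Command Prompt, in the folder containing ivshmem.inf
    REM (assumes devcon.exe is on your PATH or in the same folder)
    cd C:\path\to\extracted\ivshmem
    devcon.exe install ivshmem.inf "PCI\VEN_1AF4&DEV_1110&SUBSYS_11001AF4&REV_01"
    REM Sanity check: the device should now show up as running
    devcon.exe status "PCI\VEN_1AF4&DEV_1110*"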

Looking Glass captures at the monitor resolution, not at the game client's resolution the way OBS's Game Capture does. It captures the entire monitor at once rather than hooking directly into the game.

Right, which is why I'm running 3440x1411 as the resolution of both the guest and the client so they match.

No, you don't get it. Looking Glass is a layer above the window manager; it acts more like Display Capture in OBS than Game Capture.

If the window is 3440x1411, Looking Glass still captures the entire monitor at 3440x1440, including your taskbar.

Unless you somehow managed to use a crop function in Looking Glass that I'm not aware of, I don't see the raw buffers ending up at that resolution.

Looking Glass in its current state works best with a 1:1 pixel mapping in borderless fullscreen, with both monitors at the same resolution. There is no ROI selection on the host for custom resolutions, nor any cropping in the client.


Right, but again, I'm not running 3440x1440; I'm running 3440x1411 in this state.

(Screenshot from 2018-11-18 11:49:47)

If I run with the -F flag instead of -w 3440 -b 1411, a fullscreen window of 3440x1440 is presented with black bars at the top and bottom because the image inside the window is 3440x1411. If I run without -F and without -w 3440 -b 1411 and just rely on -a to auto-resize, a window of 3440x1411 appears. This is true for both OpenGL and EGL.
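
Concretely, the three ways I'm launching the client look like this (binary name assumed to be looking-glass-client; the flags are the ones described above):

    # Fullscreen: a 3440x1440 window with black bars above and below the 3440x1411 image
    ./looking-glass-client -F
    # Fixed window sized to the guest's usable desktop area
    ./looking-glass-client -w 3440 -b 1411
    # Neither -F nor -w/-b: -a auto-resizes the window to 3440x1411
    ./looking-glass-client -a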

Having said that, does it rely on the active signal resolution or the desktop resolution? It seems to me like it's using the desktop resolution; OBS also reports 3440x1411 for Display Capture.

(Screenshot from 2018-11-18 12:03:42)

I did some tests at 3440x1410 because I was curious whether 1411 being an odd number was throwing it off. At that resolution both OpenGL and EGL look the same, with no blurry text, so the odd height did play a factor. I can see a little line between the window and my top bar since it doesn't fit perfectly anymore, but it's not jarring.

Overall, for my setup the OpenGL performance is still noticeably smoother and uses less CPU, but watching GPU usage with radeontop shows a potential tradeoff: OpenGL uses more GPU than EGL does. It may only run smoother in this WoW-specific test, since WoW is very CPU-reliant with its old engine; I'd need to check a couple of other games to see which renderer runs better.

It uses the active signal resolution, not the desktop resolution. Windows 10 is adding to the confusion here: it doesn't count the taskbar in its desktop resolution.

Effing Windows 10. (GET BILL GATES IN HERE)

I seem to recall something about that too. Really odd choice. In any case I'll be sticking with OpenGL for now; time will tell if I should switch. :slight_smile:

It could be a driver-level issue too. Check that you aren't running any weird custom resolutions, and let your display do the scaling, not your GPU.

The EGL renderer is still OpenGL, it's just OpenGL ES. It's still early and incomplete, which is why you're seeing this issue. The issue is actually present for you with OpenGL as well; it's just masked by the mipmapping that's enabled by default.

Got it. Thank you for the information!

Commit ab98c87e7c4797ba5bb049a7deca84d964058bcb

Is lg-fonts.c missing from the commit? I don't see it on GitHub.

CMake Error at CMakeLists.txt:68 (add_executable):
  Cannot find source file:

    lg-fonts.c

Fixed, sorry about that.


Hi all. I'm experiencing low UPS numbers in Battlefield V when future frame rendering is turned on. I'm currently using a Looking Glass build from source and the EGL renderer.
I'm getting numbers around 30-40 UPS.

However, when it's turned off, the UPS matches the FPS I get in the game.

Any idea what may cause this issue?

Well, there is your answer: you are loading your video card with too much work by making it both capture and render at the same time.


First of all, love the project!

I was wondering: I've always gotten close to good performance out of LG, but never good enough that I can just play games that require precision through it. My UPS is usually above 50 when playing a game, sometimes even maintaining 60, but for no obvious reason the performance fluctuates without showing huge dips in either UPS or FPS.

Currently on the latest git build, as v11 is even worse performance-wise (for me). I'm using the parameters -sladM -K 60 (although letting it be 200 doesn't change performance all that much), vsync=0, and of course EGL. My CPU usage never hits 100%, nor does my GPU (raw output through a monitor is perfect, so the problem seems to be purely on the presentation side? As in, LG doesn't perform as well as it normally does.) But I have noticed some minor performance gains/losses when tweaking my VM parameters (only noticeable through LG).

So is there an optimal VM configuration specific to LG? Or could it just be my hardware or something? (i7 6700K @ 4.5 GHz, 1070 OC, separate drives for the VM, etc.; the host is using the iGPU, which I've tried OC'ing with no noticeable effect.) EDIT: resolution is 2560x1080

I know it’s very early days for LG, but it just seems strange. Keep up the great work! :smiley:

I had similar performance problems in LG on my iGPU (8700K). I bought a 1050 Ti to use for my host instead and the problems went away. I did notice, however, that when I uninstalled the Bumblebee drivers I was using for my iGPU in Manjaro and used the Nvidia driver instead, LG performed much better on the iGPU as well.

Too high; the Windows video capture rate can't keep up at those resolutions. This is not a Looking Glass problem.


Oh, I thought people were getting decent performance at even higher resolutions than mine. What is the highest resolution the video capture can keep up with?

EDIT: Just tested Overwatch with both 2560x1080 and 1920x1080; didn’t notice the slightest performance increase.

Hello!

I'm running into a snag with the Looking Glass host on Windows, though I assume it's more of an issue with something on the Linux side. I got Looking Glass working on older hardware a few months ago, but this is a new install. I feel like I'm missing something obvious, so hopefully I've just overlooked something.

This is the error I’m getting from looking-glass-host on Windows 10:

[I]     CaptureFactory.h:83   | CaptureFactory::DetectDevice   | Trying DXGI
[I]     CaptureFactory.h:86   | CaptureFactory::DetectDevice   | Using DXGI                                             
[E]          ivshmem.cpp:64   | IVSHMEM::Initialize            | Unable to enumerate the device, is it attached? 
[E]          Service.cpp:57   | Service::Initialize            | IVSHMEM failed to initalize

I have this included in the devices section of my libvirt config:

<shmem name='looking-glass'>
   <model type='ivshmem-plain'/>
   <size unit='M'>64</size>
   <address type='pci' domain='0x0000' bus='0x00' slot='0x0b' function='0x0'/>
</shmem>

I'm not sure what the address type='pci' bit is doing in there, but it's being added automatically when I start the VM. Permissions seem to be correct for the shared memory file:

.rw-rw---- 67M thnikk kvm 30 Nov 20:13 looking-glass
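
In case it helps anyone else, one way to get that ownership and mode applied automatically at boot is a systemd tmpfiles.d rule along these lines (a sketch; the path and user are taken from the listing above, adjust for your own setup):

    # /etc/tmpfiles.d/10-looking-glass.conf
    # type  path                    mode  user    group  age
    f       /dev/shm/looking-glass  0660  thnikk  kvm    -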

And lastly, I have a 2700X running on an MSI B450M Mortar. The host is using a GTX 660 Ti and the VM is using a Vega 64. I’m running Arch on the host and Win 10 on the VM.

Any help would be greatly appreciated.

Edit: I also read that someone else had the same error and it was fixed when they installed the driver. I double-checked and it looks like I have the right version:

The balloon driver has nothing to do with IVSHMEM; you need to install the IVSHMEM driver for the "PCI standard RAM Controller" device.
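
If you don't want to grab devcon as described earlier in the thread, Windows 10's built-in pnputil should also do the job (a sketch; run it from an elevated prompt in the folder containing the extracted ivshmem.inf):

    REM Adds the driver to the driver store and installs it on matching devices
    pnputil.exe /add-driver ivshmem.inf /install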
