How is Looking Glass support on Ubuntu 18.04? Linux Mint, to be exact.
Try it and find out
I develop LG on Debian, Ubuntu is simply a bastardised fork of Debian, so things work just fine.
Thanks. I plan on doing it soon. Just wanted to know what I’m in for.
I’m trying to install LG on my Ubuntu desktop. I’ve been following the installation guide at .looking-glass.hostfission.com/wiki/Installation
So far I’ve installed the Windows client (downloaded from .looking-glass.hostfission.com/downloads) and set up a KVM VM.
Now I’ve tried the looking-glass-client from Ubuntu’s repos, but there’s no way to connect. No debug parameters, no way to troubleshoot it. I get a purple screen, no signal.
I’m trying to compile the latest client from git, but I hit a -Werror=maybe-uninitialized error and can’t compile it.
Any guide on how to continue?
Did you install the host application? And the same version, to match what’s in the Ubuntu repository?
apt list -a looking-glass-client
looking-glass-client/eoan,now 0+b1-1build7 amd64 [configuración-residual]
(the last field is Spanish apt output for “residual-config”)
The only way I’ve found to check the Windows installed version is looking the build file:
Build Commit: 163a2e5d0a1168637da2524717b1328165c3c0b0
Any other way to check compatibility?
I wanted to compile the same version on both systems, but was unable to due to compile errors.
Thank you for your help
That’s B1, it’s tagged. Or just grab the pre-compiled B1 binary from the LG website.
I’m afraid I can’t find the precompiled Linux client, just the source code to compile it myself ;(
Is there a way to tell whether the Windows host app is running correctly? It’s listed in schtasks but not in taskmgr.
Will it work if I run the host from a cmd window?
Do I have to remove the Qemu video adapter for it to work?
You have already installed the precompiled client as a distro package, you need to get the corresponding host version for your guest from the Looking Glass website.
Yeah, I installed the build
Build Commit: 163a2e5d0a1168637da2524717b1328165c3c0b0
from the LG website, the official release, on my Windows 10 guest.
Does it work if I run it from CMD?
I’ve followed the instructions on the LG website but found a YouTube tutorial (watch?v=QxaEYOprAqM) that suggests installing the VC_redist in order for LG-host to run. I did it, no success.
To sum up,
- I followed the LG website instructions
- Balloon driver installed (virtio-win10-prewhql-0.1-161). The IVSHMEM device is configured in Windows.
- Added the XML snippet to my VM config.
- I can get video from my Windows VM when the monitor is plugged into the HW video card.
- The LG host is set up on the Windows guest as a scheduled task (version origin/Release/B1).
- Linux client installed from the repo, same version (0+b1-1build7) as the Windows host app.
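For anyone following along, the IVSHMEM snippet referred to above goes in the <devices> section of the libvirt domain XML and looks roughly like this (a sketch; the 32 MiB size matches a 1920x1080 setup and needs to grow for higher resolutions):

```xml
<shmem name='looking-glass'>
  <model type='ivshmem-plain'/>
  <size unit='M'>32</size>
</shmem>
```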
When running the LG host on the Windows VM it does nothing (no icon anywhere). I guess that’s correct.
As soon as the VM starts, /dev/shm/looking-glass is created correctly. I changed the permissions and ran the LG client:
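(Side note: since /dev/shm/looking-glass is recreated on every VM start, a tmpfiles.d rule saves redoing the chmod each time. A sketch, assuming your user is `youruser` and the file should belong to group `kvm`; adjust both to your setup:)

```
# /etc/tmpfiles.d/10-looking-glass.conf
f /dev/shm/looking-glass 0660 youruser kvm -
```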
$ looking-glass-client -sa
[W] option.c:261 | option_parse | Ignored invalid argument: -sa
[I] main.c:996 | run | Looking Glass (debian/0+b1-1build7)
[I] main.c:997 | run | Locking Method: Atomic
[I] egl.c:187 | egl_initialize | Double buffering is on
[I] egl.c:201 | egl_initialize | Multisampling enabled, max samples: 4
[I] main.c:902 | try_renderer | Using Renderer: EGL
[I] main.c:1183 | run | Using Clipboard: X11
[I] spice.c:197 | spice_connect | Remote: 127.0.0.1:5900
[I] egl.c:447 | egl_render_startup | Vendor : X.Org
[I] egl.c:448 | egl_render_startup | Renderer: AMD TAHITI (DRM 2.50.0, 5.3.0-45-generic, LLVM 9.0.0)
[I] egl.c:449 | egl_render_startup | Version : OpenGL ES 3.2 Mesa 19.2.8
[I] main.c:1259 | run | Waiting for host to signal it's ready...
[I] spice.c:409 | spice_on_common_read | notify message: keyboard channel is insecure
My virt-manager viewer goes blank, and the LG client shows a purple screen with a centered logo and no Windows desktop.
Please show the host log also, from %TEMP%\looking-glass-host.txt.
There it is, the problem.
The output of the host log:
[I] app.c:350 | app_main | Looking Glass Host (B1-rc6-6-gb979752989+1)
[I] app.c:351 | app_main | IVSHMEM Size : 32 MiB
[I] app.c:357 | app_main | IVSHMEM Address : 0x2AC0000
[I] app.c:366 | app_main | Max Cursor Size : 1 MiB
[I] app.c:367 | app_main | Max Frame Size : 15 MiB
[I] app.c:368 | app_main | Cursor : 0x2AC0080 (0x00000080)
[I] app.c:374 | app_main | Frame 0 : 0x2BC0080 (0x00100080)
[I] app.c:374 | app_main | Frame 1 : 0x3B40000 (0x01080000)
[I] app.c:381 | app_main | Trying : NVFBC (NVidia Frame Buffer Capture)
[I] wrapper.cpp:88 | NvFBCInit | NvFBC SDK Version: 112
[I] app.c:381 | app_main | Trying : DXGI
[E] dxgi.c:313 | dxgi_init | Failed to create D3D11 device: 0x887a0004 (Este sistema no admite la interfaz de dispositivo o el nivel de característica especificados.) [Spanish: "This system does not support the specified device interface or feature level."]
[E] app.c:398 | app_main | Failed to find a supported capture interface
Looks like my setup isn’t suitable for D3D11. I’ll look for a fix.
Thank you for your time Geoffrey.
I’ll post my solution as soon as I find it.
Could this be added to the installation instructions? Just to help people in my situation, i.e. telling them to look at the
%TEMP%\looking-glass-host.txt
log file.
Do I have to remove the QXL video adapter in order for it to work?
I can run the LG host on Windows when the HDMI cable is plugged into the VM’s HW video card; it shows a tray icon and:
[I] app.c:350 | app_main | Looking Glass Host (B1-rc6-6-gb979752989+1)
[I] app.c:351 | app_main | IVSHMEM Size : 32 MiB
[I] app.c:357 | app_main | IVSHMEM Address : 0x2940000
[I] app.c:366 | app_main | Max Cursor Size : 1 MiB
[I] app.c:367 | app_main | Max Frame Size : 15 MiB
[I] app.c:368 | app_main | Cursor : 0x2940080 (0x00000080)
[I] app.c:374 | app_main | Frame 0 : 0x2A40080 (0x00100080)
[I] app.c:374 | app_main | Frame 1 : 0x39C0000 (0x01080000)
[I] app.c:381 | app_main | Trying : NVFBC (NVidia Frame Buffer Capture)
[I] wrapper.cpp:88 | NvFBCInit | NvFBC SDK Version: 112
[I] app.c:381 | app_main | Trying : DXGI
[I] dxgi.c:322 | dxgi_init | Device Descripion: NVIDIA GeForce GT 730
[I] dxgi.c:323 | dxgi_init | Device Vendor ID : 0x10de
[I] dxgi.c:324 | dxgi_init | Device Device ID : 0x1287
[I] dxgi.c:325 | dxgi_init | Device Video Mem : 2007 MiB
[I] dxgi.c:326 | dxgi_init | Device Sys Mem : 0 MiB
[I] dxgi.c:327 | dxgi_init | Shared Sys Mem : 2046 MiB
[I] dxgi.c:328 | dxgi_init | Feature Level : 0xb000
[I] dxgi.c:329 | dxgi_init | Capture Size : 1920 x 1080
[I] dxgi.c:418 | dxgi_init | Source Format : DXGI_FORMAT_B8G8R8A8_UNORM
[I] app.c:282 | captureStart | Using : DXGI
[I] app.c:290 | captureStart | Capture Size : 7 MiB (8294400)
[I] app.c:292 | captureStart | ==== [ Capture Start ] ====
[I] app.c:65 | pointerThread | Pointer thread started
[I] app.c:160 | frameThread | Frame thread started
Then I switch the HDMI over to my Linux box and run the LG client: no video, and
[I] app.c:298 | captureRestart | ==== [ Capture Restart ] ====
[I] app.c:154 | pointerThread | Pointer thread stopped
[I] app.c:234 | frameThread | Frame thread stopped
[E] dxgi.c:313 | dxgi_init | Failed to create D3D11 device: 0x887a0004 (Este sistema no admite la interfaz de dispositivo o el nivel de característica especificados.) [Spanish: "This system does not support the specified device interface or feature level."]
[E] app.c:304 | captureRestart | Failed to reinitialize the capture device
Do I have to remove the QXL video adapter? Any ideas?
Seems my card is supported
Don’t delete the video adapter but set video model=none.
<video>
  <model type="none"/>
</video>
You nailed it 8-P!!
Now I get a clear image of the Windows desktop.
Thank you so much!
Could it be added to the installation instructions to help beginners?
One remaining question: how do I change the display resolution? I always end up at 1024x768, no matter the win:size value. It seems the Nvidia panel doesn’t recognise it. What’s more, no monitor is detected (which seems obvious, as it’s a headless setup). So, how do you guys set it to Full HD?
I’m happy I could help. And thanks for my first like!
Yeah, all the software stacks can be a challenge to set up, but it’s worth it once you get it going.
You’re right, the instructions could be a lot better. Unfortunately, like many things Linux, it takes a lot of forum skimming and the occasional forum plea to get things working well, if at all.
I’m no expert, but my understanding is that the looking-glass guest VM resolution is kind of problematic at the moment. (Correct me if I’m wrong @gnif.)

In general I found I need to set my guest resolution in the client (don’t forget to increase the shm size if you’re going with higher resolutions!), and in order to avoid bitmap stretching (ugly artifacts) I needed to use the same resolution in the client window as in the guest VM. But then going full-screen is an issue, since the app running in the guest VM isn’t currently capable of changing the VM resolution. (wish list++)

Currently I’ve settled on running the guest at native resolution and mostly running in full-screen mode, using Windows virt-viewer for convenience to manage my other VMs from within the native GPU VM, and only occasionally jumping out of full-screen mode and minimizing the looking-glass window when managing the Linux host. Not that stretched graphics are that bad; mostly fonts get mangled a bit.
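On the shm-size point: the sizing rule from the LG docs is width × height × 4 bytes × 2 frames, plus roughly 10 MiB of overhead, rounded up to the next power of two. A quick sketch of that arithmetic:

```shell
# Sketch: estimate the IVSHMEM size (in MiB) needed for a given guest resolution.
# Rule of thumb: width * height * 4 bytes * 2 frames, plus ~10 MiB overhead,
# rounded up to the next power of two.
w=1920
h=1080
needed=$(( (w * h * 4 * 2) / (1024 * 1024) + 10 ))
size=1
while [ "$size" -lt "$needed" ]; do
  size=$(( size * 2 ))
done
echo "${size} MiB"   # 32 MiB for 1920x1080
```

For 3840x2160 the same formula gives 128 MiB; whatever you choose also has to match the size in your shmem XML.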
All that being said, I don’t have your problem: I have both DP and HDMI outputs hooked to inputs on my Dell U3415W monitor, so Windows detects the monitor.
If that was unclear (it was): try manually setting the resolution, either in the Windows display settings or in the Nvidia control panel, to your desired setting, then set the client window to the same. You should be able to force a custom resolution in the Nvidia control panel if nothing else works. I’d suggest starting out by setting your native monitor resolution and going into full-screen mode (Scroll Lock + F), then capturing the input devices (Scroll Lock). Get that going and build from there.
I’ve managed to solve it by hard-coding the resolution in the Windows registry, following the instructions at this link. It seems to be a problem caused by the Nvidia drivers.
Yeah, I wish it could be easier, but @gnif has made such a great tool for us. I’m using it to run Fusion 360, as it doesn’t run on Linux ;(
A quick tip for newcomers.
I found this guide about CPU config in virt-manager on Arch Linux very useful.
It seems Windows goes a little crazy when the vCPUs aren’t pinned to host cores, and starts freezing every 3 seconds.
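For reference, pinning in the libvirt domain XML looks roughly like this (a sketch; the cpuset values are illustrative and should match your host’s actual core topology):

```xml
<vcpu placement='static'>4</vcpu>
<cputune>
  <vcpupin vcpu='0' cpuset='2'/>
  <vcpupin vcpu='1' cpuset='3'/>
  <vcpupin vcpu='2' cpuset='4'/>
  <vcpupin vcpu='3' cpuset='5'/>
</cputune>
```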
Hope this helps!