Imagine if there were a custom PCI card for LG, for example with an HDMI IN port and a chip on it that just grabs frames from HDMI IN (from the graphics card) into memory shareable with the CPU (for the host). A dream.
It’s called a “Capture Card”; they exist. Please don’t take this thread off topic. This thread is for LG triage, not general discussion. If you wish to discuss LG outside of help and support, please create a new thread on this forum. The community here is very welcoming and open to discussing ideas and concepts.
Dumb tip for anyone else sitting with a black client window and no errors:
Maximise the client window!
(If this has been covered already, please delete this. I’ve embarrassed myself enough already!)
This is a documented issue that is likely fixed by the patch in:
lol, thought it might be! But it’s working now. Hopefully other noobs will find this.
So after posting in the wrong place before (see “Looking Glass client outputs purple screen” for reference), I have now downgraded my looking-glass-client using the AUR package rather than the git version and installed the NVIDIA drivers for my 970, but I’m still just getting a purple output from looking-glass-client in either mode, while my second monitor displays correctly.
Oh, it looks like my ivshmem driver didn’t install properly in Windows, so that could possibly be the cause.
It certainly will be, the host can’t communicate with the client without it.
OK, so I’ve installed the ivshmem driver again by right-clicking on the .inf and installing it, and ivshmem-test now shows everything passing. looking-glass-client is connecting in both SPICE and non-SPICE modes, but the output is just a black screen, though looking-glass-host does seem to show that something is happening.
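For reference, one thing worth double-checking with a black screen is the IVSHMEM device size. As I understand the Looking Glass documentation, the rule of thumb is width × height × 4 bytes × 2 frames, plus roughly 10 MB of overhead, rounded up to the next power of two. A quick shell sketch for a 1080p guest:

```shell
# IVSHMEM sizing rule as I understand it from the Looking Glass docs:
# width x height x 4 bytes x 2 frames, plus ~10 MB overhead,
# rounded up to the next power of two.
W=1920; H=1080                                 # example: 1080p guest
MB=$(( W * H * 4 * 2 / 1024 / 1024 + 10 ))     # ~25 MB including overhead
SZ=1; while [ "$SZ" -lt "$MB" ]; do SZ=$(( SZ * 2 )); done
echo "${SZ}MB"                                 # prints 32MB for 1080p
```

An undersized device is one possible cause of capture problems, so it’s cheap to rule out.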
Am I running the host correctly? Currently I am using a cmd prompt with admin rights and running looking-glass-host.exe -f, which seems to be the only way I can get it to run, but I want to make sure this is correct.
It suddenly dawned on me while reading through 1000-odd posts that someone said AUR packages aren’t supported, which made me wonder whether the client and host had a version mismatch, and lo and behold, that was the issue.
Now for the fun stuff let’s see what the performance is like.
Yeah, some people have had luck with the AUR package, but it’s definitely a YMMV situation.
@gnif has said numerous times that the only things officially supported are the tagged releases from GitHub. There are a few people who act as alpha testers and help test his master builds, but that is not recommended.
Downgrading to A11 works fine, so the AUR package just needs updating. In the meantime I’ll update the ArchWiki with the correct information, so at least the next person to run into this will know what to do and be saved from asking the same question.
One of the things we’re trying to improve is making this documentation more obvious. As a new user, please check out my UX thread; I would like to hear your thoughts.
Well, if you need any help with the documentation, I’d be happy to pitch in and give something back to the project.
I think a wiki style might be really useful here as it means users can easily share their solutions and it will always be up to date.
I have a few questions about Looking Glass. The “before having touched it / still shopping for a GPU (probably)” type of questions.
- I always thought it copies the VRAM to the other card’s VRAM, but I’ve read somewhere that it actually uses RAM. Depending on how that works out performance-wise, I’d prefer that, since it would let me more easily use a “not so great” card for the host and have all the VRAM of my guest card usable. Also, is that configurable? Meaning, if I had enough VRAM, could I make it use the card’s VRAM instead of RAM? Or is that only applicable when using an iGPU (since that would use RAM regardless)?
- If I use an AMD card for the guest, I could still use G-Sync with Looking Glass if my host card is an NVIDIA, right? I might come to that depending on how Navi turns out. It might also be better, since AMD doesn’t restrict virtualization the way NVIDIA does, so I won’t have to work around the artificial limitations NVIDIA put in place. Kinda s*cks to be soft-locked into one brand by the monitor choice.
It’s still gonna take some time until I decide whether I even wanna try this to begin with. I’m at the eventually-probably-maybe-kinda-wanna-do-it stage of decided.
It does, but by means of a system memory copy. Unfortunately there is no way to transfer directly between the GPUs. As for performance, it’s fast enough to transfer 1000+ FPS; however, it is limited by the Windows capture performance.
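For a sense of scale (my numbers, not from the thread), the bandwidth needed to copy 1080p frames through system memory works out to roughly:

```shell
# back-of-envelope bandwidth for copying 1920x1080 32-bit (BGRA) frames via RAM
echo $(( 1920 * 1080 * 4 ))        # bytes per frame: 8294400 (~7.9 MiB)
echo $(( 1920 * 1080 * 4 * 60 ))   # bytes/s at 60 FPS: 497664000 (~0.5 GB/s)
```

Modern dual-channel memory moves tens of GB/s, so the copy itself is far from the bottleneck; the capture API is.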
This is the idea, you can use an integrated GPU for the host provided it supports OpenGL, which these days everything does.
It depends on what you mean. If you’re using a physical monitor on the guest with GSync, it will work as per normal even with LG running. If you’re trying to make GSync work on the host via Looking Glass, no, it is not possible as the LG client that runs on the host is rendering the captured image using the host’s video card, not the guest’s.
Would Looking Glass work on something like my MSI GS63VR? It’s Skylake and has a 1060.
I meant the Windows guest uses an AMD card and the Linux host my existing NVIDIA card connected to a G-Sync display. So this won’t work, because the guest card does all the rendering? But it would work if the guest also had an NVIDIA card, with the display connected to the host? Or is there no way for it to know that there even is a G-Sync display available, and does it have to be connected directly to the guest’s GPU either way for G-Sync to work?
I have an MSI Z170 M5 and a Skylake CPU, so I would like to think you are OK. Have you tested IOMMU support yet?
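A common way to check, once IOMMU is enabled in firmware and on the kernel command line, is to enumerate the groups from sysfs. A sketch along the lines of the well-known Arch wiki script:

```shell
#!/bin/bash
# List every IOMMU group and the PCI devices in it.
# No output usually means the IOMMU is disabled or unsupported.
shopt -s nullglob
for g in /sys/kernel/iommu_groups/*; do
    echo "IOMMU group ${g##*/}:"
    for d in "$g"/devices/*; do
        echo -e "\t$(lspci -nns "${d##*/}")"
    done
done
```

Your GPU should sit in its own group (or one you can safely pass through whole) for VFIO passthrough.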
No, I just wanted to know if it would be a possibility… Seems like it is?