Regression: Windows 10 VM GPU doesn't work unless a display is tuned to the VM

I'm trying to get Looking Glass working well on my new VM setup.
Back when my passthrough GPU had a DVI output, I just plugged the Windows VM into the DVI input on my monitor and kept the monitor tuned to the host, and that setup worked flawlessly.

Now I've upgraded to a Navi card for passthrough, which has no DVI output. I have a TV with HDMI inputs that I use as a secondary monitor on my host, so I plugged the VM into the TV's second HDMI input and kept the TV tuned to my host.
This does not work: Windows 10 seems to only ever use the GPU when the TV is actually tuned to the VM's input.

Is there any fix for this? Having to keep a display tuned to the VM means I can't use that display for my host, which defeats the entire purpose of Looking Glass.

You'll probably need an HDMI dummy plug.


I want to keep the option of tuning a display directly to the VM when needed, like when the Looking Glass host program crashes, which happens very frequently.

You can still do that with a dummy plug. Put the dummy plug into one HDMI output and the TV into another, then set Windows to mirror the displays (see the sketch below for flipping the projection mode from the command line).
If you don't have two HDMI outputs, or don't want to use both, there are also dummy plugs that have an HDMI passthrough port of their own, so you can plug a real monitor into the dummy plug; when the monitor is off, the dummy takes over. I don't know how reliable those are, though. Another option would be an HDMI switch.
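
If you go the mirroring route, you don't need to dig through the Settings UI every time. The projection switcher behind Win+P ships as a small utility, DisplaySwitch.exe, which you can run inside the guest from cmd, PowerShell, or a shortcut. A minimal sketch:

```
:: Mirror the desktop across all attached displays (dummy plug + TV)
DisplaySwitch.exe /clone

:: Go back to separate extended desktops afterwards
DisplaySwitch.exe /extend
```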

HDMI doesn't seem to keep asserting that a device is connected when it's powered off, the way DVI does (I have similar issues with my HDMI and DP displays, but that's more X11 fuckery than anything else), so AFAIK there's no way around this issue.
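
For what it's worth, on the Linux host side you can sometimes fake the hotplug state in software through the DRM sysfs interface instead of using hardware. A sketch, assuming a connector named card0-HDMI-A-1 (check /sys/class/drm/ for your actual connector names); note this only affects displays driven by the host GPU, so for the passed-through Windows guest a dummy plug is still the practical answer:

```
# What the kernel currently thinks about the connector
cat /sys/class/drm/card0-HDMI-A-1/status    # "connected" or "disconnected"

# Force the connector to be treated as connected even with the TV off (run as root)
echo on > /sys/class/drm/card0-HDMI-A-1/status

# Revert to normal hotplug detection
echo detect > /sys/class/drm/card0-HDMI-A-1/status
```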

I found a way around it: I purchased a DisplayPort-to-DVI adapter and used the DVI input on my monitor.