Boot GPU shows no EFI / TianoCore splash screen. Display only comes on at the Windows login screen

Hello everyone.
New to this forum but have been browsing here for a long time now. I have managed to successfully pass through 2 GPUs to 2 different guests at the same time.
Specs:
Host OS - Pop!_OS (systemd)
Guest OS - Windows 10
Mobo - ASUS Crosshair VIII Hero Wifi
CPU - AMD Ryzen 3950x
GPU 1 - RTX 2070 (boot gpu) for guest 1
GPU 2 - GTX 560 for guest 2
GPU 3 - some AMD Radeon 5k/6k (don’t know exactly) on the chipset slot for host

Using the Pop!_OS how-to tutorial I got it to work (granted, I had to improvise, since the systemd/initramfs script wouldn't run by itself on either Pop or Ubuntu for some reason). The GTX 560 took some vBIOS ROM patching and the RTX 2070 some ROM editing. The 560 boots beautifully into its VM with the OVMF splash screen (I haven't properly tested reset), but the 2070 just refuses to show that splash / boot screen until the Windows login screen appears.

I have disabled the EFI framebuffer in the systemd-boot config, and almost all of the devices get the vfio-pci module assigned to them (except the USB controller). After the system boots, an X11 script I set up makes GNOME use the AMD Radeon card, which leaves a blank black screen until Windows reaches the login screen. Shutting down the VM turns the screen off, but powering it back on doesn't turn the screen on until that same login screen.
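For reference, the host-side setup described above looks roughly like this on Pop!_OS, which uses kernelstub to manage systemd-boot kernel options. The PCI IDs below are placeholders; substitute the ones `lspci -nn` reports for your own GPUs and their audio functions:

```shell
# Find the vendor:device IDs of the GPUs (and their HDMI audio) to bind to vfio-pci
lspci -nn | grep -Ei 'vga|audio'

# Bind the passthrough GPUs to vfio-pci at boot and disable the EFI
# framebuffer so the host never claims the boot GPU's console.
# (IDs here are placeholders, not the actual 2070/560 IDs.)
sudo kernelstub --add-options "vfio-pci.ids=10de:1f07,10de:10f9 video=efifb:off"
```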

While it seems like there are absolutely no side effects, I was wondering if there is any way to fix this so that the 2070 would show the splash screen / framebuffer in case it fails at boot (like it did once, when virt-manager decided to switch to the Secure Boot OVMF for some reason).

Thanks in advance

Edit / Update: the solution is in my post below. Basic gist: it was my keyboard, which also has a USB hub built into it.

Do you have a virtual video device on the guest associated with the 2070? I’ve had that happen before where the splash screen was showing on virt-manager’s graphical console and I didn’t know. What screens pop up when you go into Windows display settings?

There are no other graphical interfaces associated with that VM. I have even removed the serial device and confirmed in Device Manager that there are no other display adapters.

Edit: I just tried rebooting the 2070 guest and it failed to start, but it did briefly show the console before it froze. I think I either have an almost-OK ROM, or the capability-mode settings differ from when it was rebooting fine. I need to try dumping the ROM in Linux rather than with GPU-Z.
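For anyone following along, dumping the vBIOS under Linux can be done through sysfs rather than GPU-Z; the PCI address below is a placeholder for whatever `lspci` shows for your card, and this generally works best while nothing (host driver or running VM) is actively using the card:

```shell
# Dump the GPU's vBIOS via sysfs (run as root).
# Replace 0000:0a:00.0 with your card's PCI address from lspci.
cd /sys/bus/pci/devices/0000:0a:00.0
echo 1 > rom                 # make the ROM readable
cat rom > /tmp/gpu-vbios.rom # copy it out
echo 0 > rom                 # disable reading again
```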

Edit 2 / Update: figured out what was causing it. It was my Das Keyboard, passed through over USB; the keyboard has a USB hub built in. I noticed from the start that it never stuck to the VM config (the bus ID always changed), so later on I would always add it while the machine was running. At first that worked fine, but eventually hot-passing it while the machine was running would lock the machine up. Switching to a much cheaper, more basic keyboard solved all the problems.
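A note for anyone hitting the same bus-ID jumping: libvirt can match a USB hostdev by vendor/product ID instead of by bus/device address, which stays stable across replugs and reboots. A sketch of that domain XML, with illustrative IDs (use the values `lsusb` reports for your keyboard):

```xml
<!-- Match the device by vendor/product instead of bus/address.
     The IDs below are placeholders, not the real Das Keyboard IDs. -->
<hostdev mode='subsystem' type='usb' managed='yes'>
  <source>
    <vendor id='0x1234'/>
    <product id='0x5678'/>
  </source>
</hostdev>
```

This wouldn't have helped with the built-in hub itself, which seems to be what caused the lockups, but it does keep the config from drifting every time the bus ID changes.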