Return to Level1Techs.com

Passing through Elgato Capture Card

Hi all.

I recently switched from Windows to Pop!_OS on my daily-driver desktop. I'm not new to Linux; I use it on the server side at work. This is my third attempt at switching on my home desktop, but in past attempts I ran into major issues I didn't have the time to push through. This time I'm committing to doing whatever it takes to stay on Linux.

I ran into issues with my Elgato HD60 Pro, which doesn't have Linux drivers. Some "open source" drivers (which really meant converting the proprietary Mac drivers for Linux use) were being developed on GitHub, but the project was abandoned before support for the HD60 Pro was added.

I use the capture card for streaming from my Analogue Super Nt to Twitch and for taking the sound from that to my speaker/headset during gameplay.

I came up with a harebrained scheme to use PCIe passthrough to run the capture card in a Windows VM and use OBS's NDI plugin to send the stream data from the VM to my main PC's OBS.

I followed Wendell’s VFIO guide, enabled VT-d in my BIOS, enabled the intel_iommu kernel option, regenerated the initramfs image with the VFIO script to enable VFIO drivers for the capture card, and added the PCI device to the VM. I’ve run into the following problems:

  • It takes 1 to 2 minutes from clicking the button to start the VM until I get visual confirmation that it has started. Is this normal for PCI passthrough setups?
  • Upon shutdown of the VM, my host machine crashes.
  • Windows is able to see the card and install drivers, but has a “No Signal” indication at all times. I’ve tried keeping my Super Nt running before boot and restarting it while the capture software is running in the VM, but I’m never able to get video output in the capture software.
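For reference, the binding step described above amounts to something like the following. This is just a sketch, not exactly what the VFIO script generates on Pop!_OS; the 12ab:0380 ID is the capture card from my lspci output later in the thread, and Pop!_OS manages kernel options with kernelstub rather than GRUB:

```shell
# /etc/modprobe.d/vfio.conf -- have vfio-pci claim the capture card at boot:
#   options vfio-pci ids=12ab:0380

# Kernel command line additions (set via kernelstub on Pop!_OS):
#   intel_iommu=on vfio-pci.ids=12ab:0380

# Then regenerate the initramfs so vfio-pci is in place before other drivers:
#   sudo update-initramfs -u
```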

Any idea what I can try next? I'd rather not resort to dual-booting because I know how I am with that. I'll end up not switching OSes and never learning how to overcome the problems I run across in Linux.

I don't know if that's an option for you, but I can confirm that the Elgato Camlink HD and 4K both work perfectly fine under Linux. There must be a USB standard for that, I guess.

That’s definitely good to know. If it comes down to it, I may end up spending some money on a new capture device and selling this one off. I only bought the Elgato card 3 or 4 months ago, so I’d like to find a way to make it work. I’ve thought about digging up old hardware to run it as a separate streaming PC on bare metal, too.

I’d like to see how far I can take the PCI passthrough idea before I explore other possibilities.

Right now you'd have a real problem getting one anyway. Webcams and streaming gear are sold out everywhere for obvious reasons. But yeah, trying to make it work in software is cheaper, of course.

That’s a good point that I hadn’t considered… Worst case scenario, I’m sure I could put together some hardware and an old case for running it as a separate streaming machine on bare metal.

Many Elgato devices are “pass-through” - they don’t really do anything to the signal besides electrical connection conversion, and appear as a UVC (USB Video Class) device, similar to most class-compliant audio devices, game controllers, etc. So as long as there is software that can handle the data stream coming from UVC, it’s fine.

The HD60 Pro is not pass-through; it applies H.264 hardware compression to the incoming video stream, and as a PCIe card, it needs a driver to talk to the OS. (The external HD60 device also does this. Important to note that the HD60 S does not, even though they look similar.)

UVC devices rely on the OS / software and the computer's CPU / GPU to do any compression, which is why they typically top out at 1080p; only newer CPUs or APUs have the speed or hardware to handle 4K video.
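To make that concrete, a class-compliant UVC device just shows up as a V4L2 node, so stock tools can read it with no vendor driver. A rough sketch (the /dev/video0 path is an assumption; check the device listing first):

```shell
# List what V4L2 sees; a UVC capture device appears here automatically
if command -v v4l2-ctl >/dev/null; then
    v4l2-ctl --list-devices
fi

# Record from it, doing the H.264 encode on the host CPU (libx264),
# since a pass-through device sends uncompressed frames.
if command -v ffmpeg >/dev/null && [ -e /dev/video0 ]; then
    ffmpeg -f v4l2 -framerate 30 -video_size 1920x1080 -i /dev/video0 \
           -c:v libx264 -preset veryfast capture.mkv
fi
```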

(Source: I just bought a used HD60 to connect a camcorder as a makeshift webcam for home-office use… :slight_smile:)

Thanks for the response. That is all interesting info that I’ll keep in mind, but I think you misunderstand what I’m talking about. I want to pass the PCIe card into a Windows virtual machine using IOMMU. That’s what I mean by passthrough, not the AV signal.

I'd give you 20 seconds, depending on hardware and memory usage, but 1 minute? No way.

This is a problem.

Let’s try to fix the other two issues first.


I’ll need a bit of information.

iommu.sh output, please.

lspci -knn output as well.

PC specs, including motherboard model and firmware version.


Are you passing through any other PCI devices?
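(In case you don't have it handy, the iommu.sh I mean is the usual sysfs walk, roughly like this:)

```shell
#!/bin/bash
# Print every IOMMU group and the devices in it, in lspci -nns format.
# nullglob makes the loops no-ops on machines with no IOMMU groups.
shopt -s nullglob
for g in /sys/kernel/iommu_groups/*; do
    for d in "$g"/devices/*; do
        echo "IOMMU Group ${g##*/} $(lspci -nns "${d##*/}")"
    done
done
```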

Output of iommu.sh:

IOMMU Group 0 00:00.0 Host bridge [0600]: Intel Corporation 8th Gen Core Processor Host Bridge/DRAM Registers [8086:3ec2] (rev 07)
IOMMU Group 10 00:1d.0 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #9 [8086:a298] (rev f0)
IOMMU Group 11 00:1f.0 ISA bridge [0601]: Intel Corporation Z370 Chipset LPC/eSPI Controller [8086:a2c9]
IOMMU Group 11 00:1f.2 Memory controller [0580]: Intel Corporation 200 Series/Z370 Chipset Family Power Management Controller [8086:a2a1]
IOMMU Group 11 00:1f.3 Audio device [0403]: Intel Corporation 200 Series PCH HD Audio [8086:a2f0]
IOMMU Group 11 00:1f.4 SMBus [0c05]: Intel Corporation 200 Series/Z370 Chipset Family SMBus Controller [8086:a2a3]
IOMMU Group 12 00:1f.6 Ethernet controller [0200]: Intel Corporation Ethernet Connection (2) I219-V [8086:15b8]
IOMMU Group 13 02:00.0 Non-Volatile memory controller [0108]: Samsung Electronics Co Ltd NVMe SSD Controller SM981/PM981/PM983 [144d:a808]
IOMMU Group 14 04:00.0 USB controller [0c03]: ASMedia Technology Inc. ASM2142 USB 3.1 Host Controller [1b21:2142]
IOMMU Group 15 05:00.0 Multimedia controller [0480]: YUAN High-Tech Development Co., Ltd. Device [12ab:0380]
IOMMU Group 1 00:01.0 PCI bridge [0604]: Intel Corporation Xeon E3-1200 v5/E3-1500 v5/6th Gen Core Processor PCIe Controller (x16) [8086:1901] (rev 07)
IOMMU Group 1 01:00.0 VGA compatible controller [0300]: NVIDIA Corporation GM204 [GeForce GTX 980] [10de:13c0] (rev a1)
IOMMU Group 1 01:00.1 Audio device [0403]: NVIDIA Corporation GM204 High Definition Audio Controller [10de:0fbb] (rev a1)
IOMMU Group 2 00:02.0 Display controller [0380]: Intel Corporation UHD Graphics 630 (Desktop) [8086:3e92]
IOMMU Group 3 00:14.0 USB controller [0c03]: Intel Corporation 200 Series/Z370 Chipset Family USB 3.0 xHCI Controller [8086:a2af]
IOMMU Group 4 00:16.0 Communication controller [0780]: Intel Corporation 200 Series PCH CSME HECI #1 [8086:a2ba]
IOMMU Group 5 00:17.0 RAID bus controller [0104]: Intel Corporation SATA Controller [RAID mode] [8086:2822]
IOMMU Group 6 00:1b.0 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #17 [8086:a2e7] (rev f0)
IOMMU Group 7 00:1c.0 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #1 [8086:a290] (rev f0)
IOMMU Group 8 00:1c.4 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #5 [8086:a294] (rev f0)
IOMMU Group 9 00:1c.7 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #8 [8086:a297] (rev f0)

Output of lspci -knn:

00:00.0 Host bridge [0600]: Intel Corporation 8th Gen Core Processor Host Bridge/DRAM Registers [8086:3ec2] (rev 07)
	Subsystem: ASUSTeK Computer Inc. PRIME H310M-D [1043:8694]
	Kernel driver in use: skl_uncore
	Kernel modules: ie31200_edac
00:01.0 PCI bridge [0604]: Intel Corporation Xeon E3-1200 v5/E3-1500 v5/6th Gen Core Processor PCIe Controller (x16) [8086:1901] (rev 07)
	Kernel driver in use: pcieport
00:02.0 Display controller [0380]: Intel Corporation UHD Graphics 630 (Desktop) [8086:3e92]
	DeviceName:  Onboard IGD
	Subsystem: ASUSTeK Computer Inc. UHD Graphics 630 (Desktop) [1043:8694]
	Kernel driver in use: i915
	Kernel modules: i915
00:14.0 USB controller [0c03]: Intel Corporation 200 Series/Z370 Chipset Family USB 3.0 xHCI Controller [8086:a2af]
	Subsystem: ASUSTeK Computer Inc. 200 Series/Z370 Chipset Family USB 3.0 xHCI Controller [1043:8694]
	Kernel driver in use: xhci_hcd
00:16.0 Communication controller [0780]: Intel Corporation 200 Series PCH CSME HECI #1 [8086:a2ba]
	Subsystem: ASUSTeK Computer Inc. 200 Series PCH CSME HECI [1043:8694]
	Kernel driver in use: mei_me
	Kernel modules: mei_me
00:17.0 RAID bus controller [0104]: Intel Corporation SATA Controller [RAID mode] [8086:2822]
	Subsystem: ASUSTeK Computer Inc. SATA Controller [RAID mode] [1043:8694]
	Kernel driver in use: ahci
	Kernel modules: ahci
00:1b.0 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #17 [8086:a2e7] (rev f0)
	Kernel driver in use: pcieport
00:1c.0 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #1 [8086:a290] (rev f0)
	Kernel driver in use: pcieport
00:1c.4 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #5 [8086:a294] (rev f0)
	Kernel driver in use: pcieport
00:1c.7 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #8 [8086:a297] (rev f0)
	Kernel driver in use: pcieport
00:1d.0 PCI bridge [0604]: Intel Corporation 200 Series PCH PCI Express Root Port #9 [8086:a298] (rev f0)
	Kernel driver in use: pcieport
00:1f.0 ISA bridge [0601]: Intel Corporation Z370 Chipset LPC/eSPI Controller [8086:a2c9]
	Subsystem: ASUSTeK Computer Inc. Z370 Chipset LPC/eSPI Controller [1043:8694]
00:1f.2 Memory controller [0580]: Intel Corporation 200 Series/Z370 Chipset Family Power Management Controller [8086:a2a1]
	Subsystem: ASUSTeK Computer Inc. 200 Series/Z370 Chipset Family Power Management Controller [1043:8694]
00:1f.3 Audio device [0403]: Intel Corporation 200 Series PCH HD Audio [8086:a2f0]
	Subsystem: ASUSTeK Computer Inc. 200 Series PCH HD Audio [1043:8724]
	Kernel driver in use: snd_hda_intel
	Kernel modules: snd_hda_intel
00:1f.4 SMBus [0c05]: Intel Corporation 200 Series/Z370 Chipset Family SMBus Controller [8086:a2a3]
	Subsystem: ASUSTeK Computer Inc. 200 Series/Z370 Chipset Family SMBus Controller [1043:8694]
	Kernel driver in use: i801_smbus
	Kernel modules: i2c_i801
00:1f.6 Ethernet controller [0200]: Intel Corporation Ethernet Connection (2) I219-V [8086:15b8]
	Subsystem: ASUSTeK Computer Inc. Ethernet Connection (2) I219-V [1043:8672]
	Kernel driver in use: e1000e
	Kernel modules: e1000e
01:00.0 VGA compatible controller [0300]: NVIDIA Corporation GM204 [GeForce GTX 980] [10de:13c0] (rev a1)
	Subsystem: NVIDIA Corporation GM204 [GeForce GTX 980] [10de:1116]
	Kernel driver in use: nvidia
	Kernel modules: nvidiafb, nouveau, nvidia_drm, nvidia
01:00.1 Audio device [0403]: NVIDIA Corporation GM204 High Definition Audio Controller [10de:0fbb] (rev a1)
	Subsystem: NVIDIA Corporation GM204 High Definition Audio Controller [10de:1116]
	Kernel driver in use: snd_hda_intel
	Kernel modules: snd_hda_intel
02:00.0 Non-Volatile memory controller [0108]: Samsung Electronics Co Ltd NVMe SSD Controller SM981/PM981/PM983 [144d:a808]
	Subsystem: Samsung Electronics Co Ltd NVMe SSD Controller SM981/PM981 [144d:a801]
	Kernel driver in use: nvme
	Kernel modules: nvme
04:00.0 USB controller [0c03]: ASMedia Technology Inc. ASM2142 USB 3.1 Host Controller [1b21:2142]
	Subsystem: ASUSTeK Computer Inc. ASM2142 USB 3.1 Host Controller [1043:8756]
	Kernel driver in use: xhci_hcd
05:00.0 Multimedia controller [0480]: YUAN High-Tech Development Co., Ltd. Device [12ab:0380]
	Subsystem: Device [1cfa:0006]
	Kernel driver in use: vfio-pci

PC Specs:

  • Intel Core i5 8600k
  • 16 GB DDR4 (I don’t know the specific DIMMs)
  • Samsung 970 Evo 500GB NVMe drive
  • ASUS PRIME Z370-A Motherboard (Version 1802)
  • NVIDIA GeForce GTX 980 (Gigabyte GV-N980D5-4GD-B)

Output from dmesg | grep vfio-pci (I found this while Googling similar issues):

[    0.000000] Command line: initrd=\EFI\Pop_OS-3dbe593c-148b-49e3-a762-22a9cf82d27e\initrd.img root=UUID=3dbe593c-148b-49e3-a762-22a9cf82d27e ro quiet loglevel=0 systemd.show_status=false intel_iommu=on vfio-pci.disable_idle_d3=1 splash
[    0.043216] Kernel command line: initrd=\EFI\Pop_OS-3dbe593c-148b-49e3-a762-22a9cf82d27e\initrd.img root=UUID=3dbe593c-148b-49e3-a762-22a9cf82d27e ro quiet loglevel=0 systemd.show_status=false intel_iommu=on vfio-pci.disable_idle_d3=1 splash
[  146.498791] vfio-pci 0000:05:00.0: enabling device (0000 -> 0002)
[  148.151887] vfio-pci 0000:05:00.0: not ready 1023ms after PM D3->D0; waiting
[  149.244311] vfio-pci 0000:05:00.0: not ready 2047ms after PM D3->D0; waiting
[  151.359228] vfio-pci 0000:05:00.0: not ready 4095ms after PM D3->D0; waiting
[  155.709352] vfio-pci 0000:05:00.0: not ready 8191ms after PM D3->D0; waiting
[  164.158996] vfio-pci 0000:05:00.0: not ready 16383ms after PM D3->D0; waiting
[  181.569493] vfio-pci 0000:05:00.0: not ready 32767ms after PM D3->D0; waiting
[  216.380406] vfio-pci 0000:05:00.0: not ready 65535ms after PM D3->D0; giving up

Well then.

I suspect that's where your slow startup comes from.

The kernel can't bring the card out of its PCIe sleep state (D3).

That would also explain why the Windows VM can't get any signal.
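One generic thing you could try before starting the VM (an idea on my part, not an Elgato-specific fix): detach the card and rescan the bus over sysfs so the kernel re-enumerates it. The 0000:05:00.0 slot is taken from your lspci output:

```shell
# Remove the device from the PCI tree, then rescan so the kernel
# rediscovers it; this sometimes recovers a card stuck in a low-power state.
# Requires root. Guarded so it is a no-op if the slot doesn't exist.
dev=/sys/bus/pci/devices/0000:05:00.0
if [ -e "$dev" ]; then
    echo 1 | sudo tee "$dev/remove"
    sleep 1
    echo 1 | sudo tee /sys/bus/pci/rescan
fi
```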

I had that same idea. In another thread, @FurryJackman (?) mentioned that passthrough does not work with these cards for some reason, though. Maybe he can elaborate on that.

I think (might be wrong) the Elgato capture thing relies on a GPU being present for encoding, because the cards do not have a hardware encoder.

Most likely they present themselves as generic USB Video Class devices.

It's showing up as a PCI device, not a USB controller, so I doubt it.

He was talking about the Camlink though, which is USB, not PCI, right? :thinking:

Ah, well then that’s off topic.

I’m very new to this, so please bear with me, but isn’t the “disable_idle_d3” option supposed to prevent it from going into D3 state? Is that option improperly applied? How can I verify that it is being applied to the card in Linux?

Yeah, see?!? That thing. … I totally nailed it! :stuck_out_tongue: :+1:


@trippsc2 If the fix "happens in hardware", also look at Magewell and AVIO. I have a Magewell 1080p HDMI-to-USB dongle and it works great everywhere. It also does more than the Elgato sticks, so the performance impact is going to be minimal. AVIO was one of the first companies capable of 4K capture (at least for consumers). IIRC those also run on Linux.

I would imagine so, let me have a look at the kernel docs.

I was thinking of the PCI device, not the USB one. :confused:

The HD60 Pro is a PCI device.

All good, was just a misunderstanding. :wink:

Looks like your disable_idle_d3 is properly defined.

Not really sure what’s going on.
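(If you want to double-check it on your end, module parameters expose their live value in sysfs; the second path uses your card's 05:00.0 slot:)

```shell
# "Y" here means disable_idle_d3 is active for vfio-pci
if [ -r /sys/module/vfio_pci/parameters/disable_idle_d3 ]; then
    cat /sys/module/vfio_pci/parameters/disable_idle_d3
fi

# The power state the kernel currently reports for the card (D0, D3hot, ...)
if [ -r /sys/bus/pci/devices/0000:05:00.0/power_state ]; then
    cat /sys/bus/pci/devices/0000:05:00.0/power_state
fi
```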