4K + HDR Capture Options

Hi there,

so in the past few weeks I’ve looked more into streaming, and even though I’m not quite sure yet whether I’ll go through with a purchase, I want to look at my options.

First off, I’m running OBS on Linux (Fedora), and even though I could go back to Windows any time, I just don’t feel like it. I know that limits my options significantly, but I also sort of want to see if it’s even possible.

Currently I own a PS4 Pro, and I do like using HDR while playing, having a 4K OLED TV that can actually utilise it. I could disable it, but if it’s there I also want to be able to use it. With the PS5 and Xbox Series X coming out (eventually), HDR will remain a hot topic, so it’s not like it’s going away. Since I’m only capturing consoles I don’t really care about 60 fps capture, because they typically only output at 30 fps anyway (and let’s face it, if it wasn’t on Twitch’s player, most people wouldn’t even notice the frame rate). I also only really need 1080p capture because I (probably) won’t be recording, so 4K is useless for the time being.

Now, on to the actual issue. From what I can tell, playing in HDR and capturing in SDR is essentially a hot mess. I know OBS doesn’t support an actual 10 bit pipeline, so we’re kinda stuck with a tone-mapped SDR signal or using LUTs of our own.
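For anyone wondering what “LUTs of our own” means in practice: OBS’s “Apply LUT” filter can load .cube files, and a .cube file is just a text file full of RGB triplets. A toy sketch in Python (the transform here is a placeholder, not a real HDR-to-SDR curve):

```python
# Toy generator for a .cube 3D LUT that OBS's "Apply LUT" filter can load.
# The transform below is a placeholder (slight desaturation); a real
# HDR->SDR LUT would encode an actual tone-mapping curve instead.
SIZE = 17  # a common small 3D LUT grid size

with open("my_lut.cube", "w") as f:
    f.write(f"LUT_3D_SIZE {SIZE}\n")
    # Per the .cube format, the red index varies fastest, then green, then blue.
    for b in range(SIZE):
        for g in range(SIZE):
            for r in range(SIZE):
                rgb = [c / (SIZE - 1) for c in (r, g, b)]
                luma = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
                out = [0.8 * c + 0.2 * luma for c in rgb]  # placeholder transform
                f.write(f"{out[0]:.6f} {out[1]:.6f} {out[2]:.6f}\n")
```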

@eposvox just recently did a video on the topic (“Gaming in HDR and streaming in SDR sucks”), although I’d been looking into this a few weeks before, so it’s a happy coincidence.

Now, from what I understand, HDR-to-SDR tone-mapping essentially works the same way everywhere: it applies a LUT to the 10 bit signal that maps the colours down to what fits into an 8 bit signal. I think I got that right, but I’m not quite certain. So the first thing I’m wondering is whether there’s any downside to doing it with a LUT in OBS. Considering that OBS only takes an 8 bit input, there’s already loss before the LUT is applied; whether that’s visible is what I’m wondering. On the other hand, with hardware tone-mapping you’re essentially stuck with the vendor’s LUT.
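To wrap my head around it, I sketched what such a tone-map conceptually does (my own toy example in Python/NumPy, not what OBS or any vendor actually ships; the peak/white levels are assumptions on my part): decode the 10 bit PQ (SMPTE ST 2084) code value to linear light, roll the highlights off with a simple Reinhard-style curve, then gamma-encode and quantize to 8 bit:

```python
import numpy as np

# PQ (SMPTE ST 2084) constants from the spec
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code10):
    """Decode a 10 bit PQ code value (0..1023) to linear luminance in nits."""
    e = code10 / 1023.0
    p = e ** (1 / M2)
    return 10000.0 * (np.maximum(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def tonemap(nits, peak_nits=1000.0, sdr_white=100.0):
    """Crude extended-Reinhard highlight roll-off into the SDR range."""
    x = nits / sdr_white                  # 1.0 == SDR reference white
    w = peak_nits / sdr_white             # input level that should map to 1.0
    return np.clip(x * (1 + x / w**2) / (1 + x), 0.0, 1.0)

def to_8bit(linear):
    """Approximate BT.709/sRGB-style gamma encode, quantized to 8 bit."""
    return np.round(255 * linear ** (1 / 2.4)).astype(np.uint8)

for code in (64, 256, 512, 769, 1023):    # a few 10 bit input levels
    nits = pq_to_nits(np.float64(code))
    print(f"PQ code {code:4d} = {nits:9.2f} nits -> 8 bit value {to_8bit(tonemap(nits))}")
```

The point being: if the capture device has already squeezed the signal to 8 bit before OBS ever sees it, the LUT runs on data that has already lost precision, which is exactly the loss I’m wondering about.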

With that uncertainty in my mind I started looking into capture cards and how they deal with HDR signals. And that seems to be even more of a hot mess from what I can tell, because it’s either not documented, or only works in certain scenarios, or only on Windows, or… pick whatever. I’ve been looking into a few cards so far (see spoiler below).

Possible Options

Usable Options

Blackmagic Intensity Pro 4K

  • Pros
    • native Linux support
    • 4K30 capture
    • 10 bit input and pass-through
    • PCIe
  • Cons
    • from what I can tell, no tone-mapper (or maybe it has one and it’s just not documented)
  • Unclear
    • no 4K60 pass-through (?) - the specs don’t say anything about it, so what happens when the console outputs at 60 Hz? Will it drop every second frame, or will it even capture anything at all?

AVerMedia Live Gamer Ultra (GC553)

  • Pros
    • UVC, so works on any device (theoretically)
    • 4K60 pass-through
    • 4K30 capture (unclear what happens with a 4K60 input signal)
    • 10 bit input and pass-through
  • Cons
    • USB, which seems to cause issues with dropped frames etc
    • no Linux vendor support (i.e. you have an issue, you figure it out)
  • Unclear
    • integrated tone-mapping (as per @eposvox’s review), although he doesn’t specify whether that’s in RECentral or hardware/firmware based

Corsair Elgato 4K60 S+

  • Pros
    • UVC
    • 4K60 pass-through
    • 4K60 capture
    • 10 bit input and pass-through
  • Cons
    • no tone-mapping (as per @eposvox’s review)
    • USB, which seems to cause issues with dropped frames etc
    • no Linux vendor support (i.e. you have an issue, you figure it out)

Corsair Elgato HD 60 S+

  • Pros
    • UVC
    • 4K60 pass-through
    • 4K30 capture (only with 4K30 input; a 4K60 input does not get captured)
    • 10 bit input and pass-through
  • Cons
    • USB, which seems to cause issues with dropped frames etc
    • no Linux vendor support (i.e. you have an issue, you figure it out)
  • Unclear
    • integrated tone-mapping (as per @eposvox’s review), although it’s unclear whether that’s in software or hardware

Impossible / Theoretical options

Corsair Elgato 4K60 Pro MK.2

  • Pros
  • Cons
    • no Linux support from what I can tell: no (official) driver, the unofficial driver project was dropped, and pass-through to a Windows VM does not work (as explained by @FurryJackman here)
    • No RGB LEDs :frowning: /s

AVerMedia Live Gamer 4K (GC573)

  • Pros
    • 4K60 pass-through
    • 4K60 capture
    • 10 bit input and pass-through
    • integrated Tone-Mapping
  • Cons
    • no Linux support from what I can tell: no (official) driver, and from what I understand pass-through to a Windows VM does not work (as explained by @FurryJackman here)
    • comes with RGB LEDs :+1: /s

So with the notes in the spoiler above, I’m pretty much looking at the Blackmagic card (where I don’t know whether a 4K60 input works) or one of the USB options.

I know there’s also Magewell and AJA, but the Cons on those pretty much come down to $$$ (and also availability). There are also AVerMedia professional cards (e.g. the CL511HN), but I can’t find them for sale anywhere (my understanding is that they’re OEM/ODM only).

Soooo… essentially, what’s the least-headache option for HDR (with or without SDR tone-mapping) on Linux as of now?

Decklink Mini Recorder 4K + Atomos Ninja V. Put a splitter before the Ninja V so the pass-through goes to your destination monitor, then downconvert and tonemap in the Ninja V. This way you don’t have to worry about the Ninja V’s processing delay on the way to the destination monitor. You can also record in DNxHR if you have an SSD, and that is usable in the free version of DaVinci Resolve (no H.264 issues there).

Even if you don’t plan to record, the best hardware tonemapper for the price is still the Ninja V. Nothing else even comes close at that price point.

Splitter choices are where you absolutely can’t cheap out. Find one that is 100% reliable (not an Amazon one) and stick with it. The Gefen is the only one that Multiboxology trusts:

https://www.gefen.com/product/4k-ultra-hd-600-mhz-12-splitter-w-hdr-EXT-UHD600-12

The Intensity Pro 4K converts HDMI to DisplayPort before capturing and converts DP back to HDMI for passthrough, so it creates a whole host of problems, and it’s limited to 4K 30p. The Intensity Pro also doesn’t tonemap. The Decklink uses FPGA logic to act as an HDMI receiver, and so adapts better to signal loss. It doesn’t have passthrough, though, but a splitter is logically better for this purpose anyway.

Hey, thanks for chiming in.
Soo, I just looked those devices up, and those are some mighty price tags right there :slight_smile:
It’s nice to see the tech being available, but I was more looking at… uhm… “affordable” options (hence why I mentioned Magewell and AJA in the bottom paragraph :wink: )

OK, did not know that. That seems like a weird way to go about it when the Decklinks are just using straight HDMI…

And I assume that means throwing a 4K60 signal at it won’t give you any capture at all?

Since you mentioned the Decklinks: I was also looking into those, but that splitting was pretty much the issue I had with them. There are a lot of cheap(-ish) splitters that supposedly support 4K60 HDR (example), so I’m wondering what sets the pricey ones apart (other than business use).
I was also debating getting an AV receiver at some point… which would kinda solve that splitting problem too, since most mid-range AVRs come with dual outputs… not sure how well that works, of course.

The AV receiver might mandate HDCP 2.2. The Gefen splitter just follows the source, so you can turn off HDCP in the PS4 Pro and it will work with the Ninja V.

Do NOT get a cheap splitter. They lack proper power phases and overheat, and they’re often configured from leaked documents or reverse engineering rather than the true NDA specs for the chips, because HDMI Licensing requires NDAs to get chip datasheets. The biggest difference is EDID management, which none of the cheaper splitters have. (If they do have it, it’s incorrect: HDMI 2.0 HDCP 2.2 EDID emulators below $200 are all 1080p EDID emulators with 4K EDID data pre-programmed. That simply doesn’t work.)

The Decklink might capture at half rate or not at all. Haven’t tried yet on a console that forces 4K 60p.

That probably depends on the model, but from what I’ve seen they just carry over whatever the input had.

In what manner is that important though? Don’t most of them just pass on the EDID of the target output?

Nope, some splitters switch which EDID is selected depending on the last connected device. That’s “dumb” EDID switching. The smarter ones only pass through the EDID from a single port. The reason the display disconnects when you connect your capture card is that the splitter “stupidly” switches over the EDID. This is why an EDID emulator is important, or a splitter that only reads the EDID from a single port rather than switching between the two ports. The Gefen has EDID management, which is a step above EDID emulation. You can use a USB port to upload custom EDIDs.
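As an aside, if you want to verify what EDID is actually reaching your gear on Linux, the kernel exposes the raw blob in sysfs. A quick sanity-check sketch (connector names vary per machine; a 4K/HDR-capable sink should report at least one extension block, since the CEA-861 extension carries the 4K/HDR capabilities):

```python
from pathlib import Path

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def check_edid(blob: bytes) -> str:
    """Minimal EDID sanity check: header, base-block checksum, extension count."""
    if blob[:8] != EDID_HEADER:
        return "bad EDID header"
    if sum(blob[:128]) % 256 != 0:
        return "base block checksum failed"
    return f"{len(blob)} bytes, {blob[126]} extension block(s)"

# Connector names differ per machine; HDMI-A-* covers typical HDMI ports.
for edid_file in Path("/sys/class/drm").glob("card*-HDMI-A-*/edid"):
    data = edid_file.read_bytes()
    if len(data) >= 128:  # the file reads empty when nothing is connected
        print(edid_file.parent.name, "->", check_edid(data))
```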

I assure you that AV receivers can and will add their own HDCP chain, because most have an OSD or GUI over HDMI to configure them. If all your devices are HDCP compliant, it’s fine. But if one isn’t, your capture will be blank, and your Decklink could crash your system, because the HDCP bit is flipped and ALL input is disabled until a full power cycle.

Mh, I kinda thought most would do that but… oh well.
But theoretically this should be enough for casual use then, no? Mind you, this isn’t a big production where money is no issue, just casual use every now and then (not even daily use).
An AVR would essentially need testing, I guess… or asking the vendor’s customer support how it handles it. Most mid-range AVRs also have a pass-through mode that doesn’t do any signal processing, so that should leave HDCP alone too, correct?

Going back to the capture though… since it’s uncertain whether any of the Blackmagic products will actually work with a 4K60 input, the point about splitters is kind of moot (which is a bummer, because I’d rather pay for professional-grade equipment than the gamer tax). I just find it odd (and/or frustrating) that the professional-grade equipment seems to be worse in some aspects (from a layman’s perspective) than some (all?) of the “gamer” products.

So are there any other options than biting the bullet and getting a USB solution until something else becomes available?

The conversion and tonemapping in gaming products give you far less freedom to tune the “lookup table” and process it further. Most of the gaming products are configured on Windows with Elgato or AVerMedia software. With the Ninja V, you don’t need that software.

Firmware configuration with passthrough is also not guaranteed. USB solutions are also limited to 60 fps NON DROP FRAME. In the Americas, 60 fps is actually 59.94 fps. NONE of the USB solutions capture this properly, because the UVC standard was designed with only NON DROP FRAME framerates in mind.

You will put less strain on OBS if you process it with the Ninja V, and the Blackmagic Decklink is the cheapest way to properly capture 1080p59.94 without dropped frames or improper frame pacing.

EposVox doesn’t explore the drop-frame issue on UVC devices. I did, and I guarantee you ALL UVC-based devices don’t work well with drop-frame framerates. Everything that’s 60 fps in North America for broadcast is actually 59.94 fps.

That’s what I figured. Unfortunately I don’t have $600 lying around to drop on a product I’d hardly ever use, so as much as I’d like to get something like it, I just can’t justify the cost.

That’s what I read in a few posts of yours and why I added USB to the Cons lists :wink:

Well, for one thing I’m in Europe, but for another this isn’t about broadcasting. Don’t the consoles output 60 fps regardless of region? From a console capture perspective the drop frames shouldn’t matter in this case, right? They would just duplicate a frame if they can’t render one in time, no? Or are they also operating on the 59.94?

Most likely it’s 59.94 for consoles. Even in Europe. Digital Foundry uses European consoles and they are also 59.94.

hhmm… well that sucks :confused:
Wondering why the USB capture cards don’t adjust to that and just duplicate a frame every so often…

But either way, I was never clear on what actually makes this so bad. Does the signal drop out, or what’s going on?

Frame pacing gets misaligned, so you get nasty motion jitter every once in a while. If the frame rates don’t match between the signal, the scaler, and the capture framerate, you get discontinuity.

Something common as well: people using Final Cut Pro 7 often just drop 23.976 video into a 29.97 timeline, and it duplicates frames every now and then, causing jittery motion that is extremely jarring. For 59.94 to 60.00, the jitter is much higher frequency and equally disorienting in high motion when done wrong. (Linux V4L2 and UVC do this VERY wrong.)
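To put numbers on the idealized case (back-of-the-envelope, assuming a capture clock at exactly 60.00; real UVC/V4L2 timestamping is messier, which is where the higher-frequency jitter comes from): 59.94 is really 60000/1001, so the two clocks drift apart by a full frame roughly every 16.7 seconds, and that frame has to be duplicated or dropped:

```python
from fractions import Fraction

source = Fraction(60000, 1001)   # "59.94" fps as actually defined
capture = Fraction(60, 1)        # a naive capture clock at exactly 60.00

drift = capture - source                 # surplus frames accumulated per second
seconds_per_slip = 1 / drift             # time until one full frame of drift
frames_per_slip = capture * seconds_per_slip

print(float(source))            # 59.94005994...
print(float(seconds_per_slip))  # ~16.68 s between forced duplicate frames
print(frames_per_slip)          # 1001 captured frames per slip
```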

I’ve been trying to push EposVox to test this frame pacing issue between Windows and Linux with UVC devices and the TestUFO frame-skipping check, but that’s fallen on deaf ears. You have to test this over about an hour or more of captured footage, because if you only do 5 minutes, it’s easy to be misled.

Hm… that’s what I thought/expected. I’m essentially just wondering how bad it actually is and if it would still be worth it, even if not optimal, until a “proper” solution comes around.

I’m just wondering why Blackmagic doesn’t seem to have anything for this use case. “Gaming” can’t be the only thing with that issue, right? And even if it were, they are trying to appeal to the gaming crowd already anyway (see the 60 fps marketing material). Just… odd.

AJA does have HDR-capable capture now, but there’s a really big catch with AJA cards: there’s planned obsolescence built in, as in you get completely locked out of using the card if your software or driver is too new (it’s not just unsupported, it’s actively locked out). Avoid AJA purely for this reason.

Blackmagic has a card with four HDMI 2.0 inputs, but it doesn’t have loop-out, and it requires x8 PCIe 3.0 lanes plus an extra 6-pin power connector. It also has no tonemapping.

There’s a 12G-CROSS HDMI to SDI converter, but even that can’t upload LUTs or Tonemap.

I strongly suggest against the Blackmagic Video Assist recorders, because they downconvert the preview to 30p, and the loop-through latency is 5+ frames.

The Ninja V’s only downside is lack of proper SDI functionality, but for casual gaming with 4K in mind, it’s perfect.

So it essentially comes down to the current situation being a pain in the ass :frowning:

Except for the price, unfortunately :frowning:

It’s $100 off right now according to Atomos. And it’s the same level of hardware that Digital Foundry uses.

Even with the discount it’s ~600€ tho :confused:

So… one thing I sort of forgot and/or mixed up and just remembered…

Since I’m not really looking at 4K capture, the 30 FPS limit in 4K isn’t an issue for me.
Do you happen to know what the Intensity does when it gets a 4K60 (or 4K59.94) signal, but is set to capture 1080p60 (or 1080p59.94)?

Essentially my question is: does that apply only to the capture, or also to the pass-through?

Same for the Decklinks I guess.

Nope, the Intensity does not have enough FPGA logic to downconvert in real time.

It runs 30p at 4K through the full image processing pipeline. It is NOT a direct passthrough, especially on the Intensity.

So I’d have to input actual 1080p60 for 1080p60 capture, or 4K30 for 4K30 capture, as I understand it. So it’s not a “real” pass-through then, i.e. there’s no direct connection between input and output.

Well that sucks :frowning:

Would be interesting to know how the Decklinks handle 4K60 input… i.e. if they just drop every second frame, or if they can still capture 1080p60.

Unfortunately the manual covers all Decklink cards and I don’t really know what to look for :confused: