30-bit color depth (10 bits per channel) breaks everything

Having a deep color monitor (going forward, 4K monitors will have to support 10-bit color if they advertise HDR10), I wanted to test how deep color would work with the X server and NVIDIA drivers on Linux…

Turns out, if you set the bit depth to 10 bits per channel (30-bit, i.e. DefaultDepth 30 in xorg.conf), everything breaks.
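
For reference, this is roughly the xorg.conf change that triggers it; a minimal sketch, with the Device/Monitor identifiers as placeholders for whatever your existing config uses:

    Section "Screen"
        Identifier    "Screen0"
        Device        "Device0"       # placeholder identifiers; match your own config
        Monitor       "Monitor0"
        DefaultDepth  30              # 10 bits per channel (30-bit color)
        SubSection "Display"
            Depth     30
        EndSubSection
    EndSection

Setting DefaultDepth back to 24 restores the usual 8-bits-per-channel behavior.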

Cinnamon is broken at 30-bit color.

Steam is broken at 30-bit color.

KDE Frameworks applications all fall back to an emergency rendering mode (you can see it in their debug output).

This has been my experience on Cinnamon; if things work better on other DEs, I would love to hear about it. Being unable to cope with 10-bit color looks like a major failure.

I tried 36-bit (12 bits per channel) and it didn’t work on my particular monitor.

Edit:

The Steam issue has been reported and is still not solved in 2019.

The Cinnamon issue has also been reported, and is also not fixed in 2019.

Arch Linux thread:

https://bbs.archlinux.org/viewtopic.php?id=212467

Gentoo thread:

https://forums.gentoo.org/viewtopic-t-942736.html

Fedora threads:

https://ask.fedoraproject.org/en/question/119554/black-screen-defaultdepth-30-fedora-27-kde/

https://ask.fedoraproject.org/en/question/78819/screen-30-bit-colour-depth/

Nobody has taken the initiative to fix 10-bits-per-channel color on Linux. This is a big blocker for HDR and WCG on Linux, and 10-bit monitors will only become more common as more monitors “support” HDR10.

The problem is probably not that no one has taken the initiative, but that none of the developers own or use those monitors.

Well, HDR10 hardware is getting cheaper. @wendell’s recent BenQ EL2870 supports HDR10 signals.

Just set DefaultDepth 30 in xorg.conf and you will see ALL THE PROBLEMS.

I agree with @Eden on this. As with the other HDR-on-Linux topic: Linux is already a tiny market, and HDR is a tiny part of the overall market, so HDR-capable hardware running Linux has to be a handful of people worldwide. And since this is mostly free-time community work, nobody is going to work on something that pretty much only they would use.

Unfortunately this is just nowhere near common enough to get supported.

But people like me and @wendell would still like to play with it, just like we play with Looking Glass.

On the Windows front, with so many HDR TVs on the market right now, Windows properly supports HDR, 10-bit, 12-bit, etc…

So this would be another “in 20 years” issue that will never be addressed.

Oh, I am not doubting that people want it; it is certainly a good thing to have. I am just noting that with so few people in a very specific market, with an even more specific setup, it is unlikely you will see it fixed or added anytime soon, unless you do the work yourself.

I mean, there are things that would be classed as critically missing from Linux that still have not been fixed or added in years. So it is a bit of wishful thinking that this one would be.

This is another unfortunate case of Windows existing and owning the market.

Do you work on Wayland and Xorg? If so, you should make the changes.

It’s not that people aren’t interested; it’s that the people who are interested aren’t the developers.

The developers don’t have that tech and don’t use it, so they haven’t built anything for it yet.

Well, to some people it’s another blocker for Linux as a desktop platform. But the fix isn’t easy; as Eden said, devs don’t need HDR video, and devs aren’t going to code HDR games for Linux.

The big blocker I see is the digital signage industry switching to Windows Embedded/Windows 10 because they can’t properly utilize HDR with a Linux playout box. Digital projection with interactive pre-movie elements at a movie theater? Windows seems more sensible.

Devs will work for money, so if someone wanted to pay them, they’d do it.

I would see that as an unRAID situation: if someone really wants HDR video playback on Linux, they’d go to proprietary developers to get it. Most playout systems for digital signage are based on Linux but use proprietary components, so this would make sense.

It’s just too bad for everyone else though. There’s nothing we can do about that.

You can pay the developers who work on the open source projects; there’s no need to hire developers to make proprietary tools.

Right, but for cases like digital signage companies, people will pay to get the job done and keep it proprietary.

Or they take the path of least resistance, which is Windows 10.

If digital signage starts using HDR, that would be astounding. Most digital signage I see uses super-low-quality panels just to put words in lights.

Yeah, that’s what I’m hoping too. 4K digital signage is already commonplace, but not 4K HDR… YET.

Heck, Japan has 4K HDR QVC (YES, the QVC shopping channel) now, for goodness’ sake…

Has HDR support improved at all? I’m trying to migrate to Linux on my primary PC finally. In Windows, I was able to set color to 12-bit in the NVIDIA Control Panel by reducing the refresh rate to 30Hz. In NVIDIA X Server Settings, even when set to 30Hz, bit depth is limited to 8 bpc. Any tricks to get this working?
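
For what it’s worth, you can at least confirm what depth the X server actually came up with (assuming the standard xdpyinfo utility is installed):

    # Check the depth the running X server is actually using
    # (24 = 8 bits per channel, 30 = 10 bits per channel)
    xdpyinfo | grep -E "depth of root window|depths"

If that still reports 24, the 10 bpc setting never actually took effect.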

Nope. It’s still broken.

How about the fact that in Windows, 3840x2160 displayed perfectly fine on my LG OLEDB7A, but in Linux, I have to set Underscan to 49 or the edges of the screen are cut off? Is there a fix for this, or do I need to refresh my memory on mucking about with an xorg configuration file to make this seemingly unnecessary change permanent?
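
If you do end up back in xorg.conf, the underscan should be persistable as a metamode via the NVIDIA driver’s ViewPortIn/ViewPortOut options; a rough sketch, where the output name, mode, and viewport numbers are placeholders rather than values verified on that TV:

    Section "Screen"
        Identifier  "Screen0"
        Device      "Device0"
        Monitor     "Monitor0"
        # ViewPortOut shrinks and centers the scanned-out image to compensate for
        # the TV's overscan; ViewPortIn keeps the desktop at the full 3840x2160.
        Option      "metamodes" "HDMI-0: 3840x2160 { ViewPortIn=3840x2160, ViewPortOut=3742x2106+49+27 }"
    EndSection

The same metamode string can be tried at runtime with nvidia-settings --assign CurrentMetaMode=… before making it permanent.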

Use a DP to HDMI active adapter. That might eliminate the underscan issue.

What does 10-bit even have to do with HDR?

On Windows we have had 16 bpp support since Windows 7 (kinda broken, but not totally).
Every TV that isn’t total trash has supported 12-bit input since, I don’t know, HDMI 1.3. It’s a totally basic thing.

Why is it even assumed that you need 10-bit for either HDR or WCG? It’s not that big of a deal; for video playback the difference comes down to noise level, and we’re talking about dither-noise level. Sadly, it’s not rare that it’s actually better to send 8-bit dithered instead of 10-bit, because it’s very hard for bad processing to introduce banding into a dithered 8-bit signal, but trivial with a 10-bit input signal when the end device again only processes at 10-bit.

So while it is bad that this is broken, it is not the reason HDR support is so poor.

You would be shocked to hear how broken Windows HDR is, or was.
We currently have three major APIs: the general Windows API, which, well, is Windows…; the NVIDIA API, which is pretty great in theory because it supports pretty much everything (even D3D9 works with it) but had some dumb bugs like overwriting metadata and sending nonsense; and the currently extremely buggy AMD AGS API, which works fine with older GPUs but is utterly broken with new cards like Navi, and it has dumb limitations.

So what’s the most important thing here? GPU driver support. Does Linux even have HDR driver support? So why do you even care about bit depth at this point?

In short, I’ve been able to correctly play back content with 10-bit color and HDR within Windows and would like to do the same in Linux.
