Since I have a deep-color monitor (and going forward, 4K monitors will have to support 10-bit color if they advertise HDR10), I wanted to test how deep color works with the X server and NVIDIA drivers on Linux…
It turns out that if you set the bit depth to 10 bits per channel (30-bit in xorg.conf), everything breaks.
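For reference, this is roughly how depth 30 is enabled in xorg.conf — a sketch only; the `Identifier` and `Device` names are assumptions for a typical NVIDIA setup:

```
Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"     # hypothetical identifier for the NVIDIA GPU
    DefaultDepth 30            # 10 bits per channel ("deep color")
    SubSection "Display"
        Depth    30
    EndSubSection
EndSection
```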
Cinnamon is broken at 30-bit color.
Steam is broken at 30-bit color.
Other KDE Frameworks applications report falling back to an emergency rendering mode in their debug output.
This has been my experience on Cinnamon; if your experience on other DEs is better, I would love to hear about it. Being unable to cope with 10-bit color looks like a major failure.
I also tried 36-bit (12 bits per channel), and it didn't work on my particular monitor.
Edit:
The Steam issue is still not solved in 2019.
The Cinnamon issue has also been reported, and is likewise not fixed in 2019.
Nobody has taken the initiative to fix 10-bit-per-channel color on Linux. This is a big blocker for HDR and WCG on Linux, and 10-bit monitors will only become more common as more monitors “support” HDR10.
I agree with @Eden on this. As in the other topic about HDR on Linux: Linux is already a tiny market, and HDR is a tiny part of the overall market, so the number of people with HDR-capable hardware running Linux has to be a handful worldwide. And since this is largely free-time community work, nobody is going to work on it essentially just for their own use.
Unfortunately, this is just nowhere near common enough to get supported.
Oh, I'm not doubting that people want it; it's certainly a good thing to have. I'm just noting that with so few people in a very specific market, with an even more specific setup, it's unlikely you'll see it fixed or added anytime soon unless you do the work yourself.
I mean, there are things that would be classed as critically missing from Linux that still haven't been fixed or added in years. So it's a bit of wishful thinking that this one would be.
This is another unfortunate case of Windows existing and owning the market.
Well, to some people it's another blocker to Linux as a desktop platform. But the fix isn't easy; as Eden said, devs don't need HDR video, and devs aren't going to code HDR games for Linux.
The big blocker I see is the digital-signage industry switching to Windows Embedded/Windows 10 because they can't properly utilize HDR with a Linux playout box. Digital projection with interactive pre-movie elements at a movie theater? Windows seems more sensible.
I'd expect that to be an unRAID-style situation: if someone really wants HDR video playback on Linux, they'll go to proprietary developers to get it. Most playout systems for digital signage are based on Linux but use proprietary means, so that would make sense.
It’s just too bad for everyone else though. There’s nothing we can do about that.
Has HDR support improved at all? I'm finally trying to migrate my primary PC to Linux. On Windows, I was able to set the color depth to 12-bit in the NVIDIA Control Panel by reducing the refresh rate to 30 Hz. In NVIDIA X Server Settings, even at 30 Hz, the bit depth is limited to 8 bpc. Any tricks to get this working?
And how about the fact that on Windows, 3840x2160 displayed perfectly fine on my LG OLEDB7A, but on Linux I have to set Underscan to 49 or the edges of the screen are cut off? Is there a fix for this, or do I need to refresh my memory on mucking about with an xorg configuration file to make this seemingly unnecessary change permanent?
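One way to make an underscan adjustment permanent with the NVIDIA proprietary driver is a MetaMode in xorg.conf: `ViewPortIn` is the rendered desktop size and `ViewPortOut` is the region of the raster it is scaled into. A sketch — the exact pixel values here are illustrative and would need tuning to match the underscan amount, and the `Identifier` is assumed:

```
Section "Screen"
    Identifier "Screen0"
    # Scale the full 3840x2160 desktop into a smaller centered region of
    # the output raster, leaving black borders the TV's overscan then crops.
    # 49 px trimmed per side horizontally; vertical offset kept proportional.
    Option "metamodes" "3840x2160 { ViewPortIn=3840x2160, ViewPortOut=3742x2106+49+27 }"
EndSection
```

Many TVs (including LG OLEDs) also have a 1:1 pixel-mapping option such as “Just Scan” on the HDMI input, which avoids overscan at the source and may make this workaround unnecessary.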
On Windows we have had 16-bit support since Windows 7 (kind of broken, but not totally).
And every TV that isn't total trash has supported 12-bit input since, I don't know, HDMI 1.3. It's a totally basic thing.
Why is it even assumed that you need 10-bit for either HDR or WCG? It's not that big a deal: for video playback, the difference is at dither-noise level. Sadly, it's not rare for a dithered 8-bit signal to actually be better than 10-bit, because it's very hard for bad processing to introduce banding into a dithered 8-bit signal, but trivial to introduce it into a 10-bit input when the end device also processes at only 10 bits.
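The dithering point can be illustrated with a toy quantization in NumPy (a sketch, not a video pipeline): a scene value that falls between two 8-bit code values is lost entirely by plain rounding (a flat band), but survives in the local average of a dithered signal at the cost of fine noise.

```python
import numpy as np

rng = np.random.default_rng(42)

# A constant "scene" brightness that falls between two 8-bit code values.
v = 100.3 / 255.0
signal = np.full(100_000, v)

levels = 255
# Plain quantization: every sample rounds to code 100 -> the 0.3 is gone.
plain = np.round(signal * levels) / levels
# Dithered quantization: add uniform noise before rounding, so samples
# land on codes 100 and 101 in proportion to the true value.
noise = rng.uniform(-0.5, 0.5, signal.size)
dithered = np.round(signal * levels + noise) / levels

print(round(plain.mean() * levels, 3))     # -> 100.0 (flat band)
print(round(dithered.mean() * levels, 3))  # ~100.3 (average preserves the level)
```

The eye averages the fine dither noise away, which is why a well-dithered 8-bit ramp can look smoother than a 10-bit one that was re-quantized badly downstream.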
And while it's bad that this is broken, it is not the reason HDR support is so poor.
You would be shocked to hear how broken Windows HDR is, or was.
We currently have three major HDR APIs: the general Windows API (which, well, is Windows…); the NVIDIA API, which is pretty great in theory, supporting pretty much everything (even D3D9 works with it), but has had some dumb bugs like overwriting metadata and sending nonsense; and the currently extremely buggy AMD AGS API, which works fine with older GPUs but is utterly broken with new cards like Navi, and has dumb limitations.
So what's the most important thing here? GPU driver support. Does Linux have HDR driver support? If not, why do you even care about bit depth at this point?