30-bit color depth (10 bits per channel) breaks everything

There is no support in Xorg or Wayland at all. You have to bypass Xorg or Wayland to get access to custom modes that allow 10-bit color without crashing.

So you cannot run a desktop environment with 10-bit color, because it will instantly crash. No effort is being made to fix it because it's not important.
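
For reference, the usual way to even ask Xorg for a 30-bit root window is a Screen section roughly like this (a minimal sketch; the identifier and any driver-specific options are placeholders), and this is exactly the setting that makes desktop environments fall over:

    Section "Screen"
        Identifier   "Screen0"
        # 30-bit framebuffer = 10 bits per channel
        DefaultDepth 30
        SubSection "Display"
            Depth 30
        EndSubSection
    EndSection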

So no, you cannot do this on Linux right now and for decades to come.

To do the same you first need HDR support in your GPU driver. 10-bit is not important and not needed at all for a TV; AMD's AGS needs it as an input, but the GPU output can be whatever.

Advanced users on Windows avoid 10-bit, and to this day the Windows desktop can't do 10-bit.
10-bit with Adobe on professional GPUs renders to RGBA, where the A channel is misused to store the extra 2 bits of R, G and B.
Video players and games use surfaces that only support 10-bit in fullscreen. On Windows 7 you need FSE; Windows 10 supports 10-bit and "16-bit" (personally I've never used that) for WFS, but it has to be in fullscreen mode or it will fall back to 8-bit.
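
The rough idea of that RGBA trick, purely as an illustration (this is only a sketch of how 10-bit values can be smuggled through an 8-bit RGBA surface, not the exact layout Adobe or any driver uses):

    #include <cstdint>

    // Sketch: pack one 10-bit-per-channel pixel into an RGBA8 value.
    // R, G and B keep their high 8 bits; the low 2 bits of each channel
    // are stashed in the alpha byte (2+2+2 = 6 bits used, 2 bits spare).
    uint32_t pack10ToRgba8(uint16_t r10, uint16_t g10, uint16_t b10) {
        uint8_t r = uint8_t(r10 >> 2);
        uint8_t g = uint8_t(g10 >> 2);
        uint8_t b = uint8_t(b10 >> 2);
        uint8_t a = uint8_t(((r10 & 0x3) << 4) | ((g10 & 0x3) << 2) | (b10 & 0x3));
        return (uint32_t(a) << 24) | (uint32_t(b) << 16) | (uint32_t(g) << 8) | r;
    }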

The GPU bit-depth options are completely unrelated to the Windows desktop and these surfaces.

You can render, or let's say correctly present, 16-bit on a D3D11 surface and the GPU will output 6-bit to the display, doing the dithering on its own. (6-bit output is only available on AMD, and you usually need a gaming monitor.)
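
On the application side that looks roughly like this (a minimal sketch; the function name and setup are illustrative, it assumes an existing D3D11 device, DXGI 1.2 factory and window, error handling is omitted, and whether more than 8-bit actually reaches the display still depends on the fullscreen conditions above):

    #include <windows.h>
    #include <d3d11.h>
    #include <dxgi1_2.h>

    // Request a high-bit-depth back buffer; the driver decides what the
    // display link actually carries and dithers down on its own if needed.
    IDXGISwapChain1* createDeepSwapChain(ID3D11Device* device,
                                         IDXGIFactory2* factory, HWND hwnd)
    {
        DXGI_SWAP_CHAIN_DESC1 desc = {};
        desc.Width  = 0;   // 0 = take the window's size
        desc.Height = 0;
        // 10 bpc fixed point; DXGI_FORMAT_R16G16B16A16_FLOAT for "16 bit"
        desc.Format = DXGI_FORMAT_R10G10B10A2_UNORM;
        desc.SampleDesc.Count = 1;
        desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        desc.BufferCount = 2;
        desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD;

        IDXGISwapChain1* swapChain = nullptr;
        factory->CreateSwapChainForHwnd(device, hwnd, &desc,
                                        nullptr, nullptr, &swapChain);
        return swapChain;
    }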

Bought an active adapter and it didn’t fix the problem. I probably should have mentioned that this only happens in Linux when using the proprietary driver. Nouveau works fine.

Edit: Also, Kodi displays fine in fullscreen without the underscan setting.
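
For context, the underscan setting on the NVIDIA driver comes down to a ViewPortOut MetaMode; in xorg.conf terms it is roughly something like this, with the numbers depending on how much the TV crops (check the driver README for the exact syntax):

    Section "Screen"
        Identifier "Screen0"
        # render 1920x1080 but scan it out into a smaller, centered area
        Option "metamodes" "1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1824x1026+48+27}"
    EndSection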

It might be down to how the TV interprets the signal. Try to find a "Size" option on the TV; it has to be set to "full pixel" or 1:1.

Full-screen apps don’t overscan when using the NVIDIA driver. The Nouveau driver doesn’t overscan at all. Windows doesn’t overscan at all. Only the desktop overscans and only in Linux when using the NVIDIA driver. The TV only has an ‘aspect ratio’ setting, which is set to ‘original’.

Edit: Overscan probably isn't even the right word to use here. It really seems to come down to Cinnamon rendering the panel beyond the edges of the screen. Even the desktop icons appear to be the correct distance from the edge of the screen.

Try KDE; it might have better settings for that, as it knows the difference between HDMI, DVI and DP.

Okay, so I know I’ve completely hijacked this thread, but I thought I’d give an update anyhow. I remembered that setting the input name and icon to PC on this TV (LGOLEDB7A) automatically disables some of the advanced features and changes some things that aren’t directly shown to the user. Guides on picture/color calibration specifically tell you not to do it. Out of curiosity, I did exactly that and all of a sudden the desktop renders correctly without scaling. After ensuring Deep Color was still enabled for the input and setting the Gamma to ‘High2’ and Sharpness to 0, everything looks good enough to my eye so far (though I haven’t done any real content consumption yet).

PC input labels sometimes send a completely different EDID to the GPU and definitely change the internal processing pipeline. I see this on my Samsung T240HD.
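
If you want to see it for yourself, you can dump the EDID the kernel received and compare it before and after changing the label; a minimal sketch (the connector path is just an example, adjust it to your card and output):

    #include <fstream>
    #include <iostream>
    #include <iterator>
    #include <vector>

    int main() {
        // hypothetical connector path; list /sys/class/drm/ to find yours
        std::ifstream f("/sys/class/drm/card0-HDMI-A-1/edid", std::ios::binary);
        std::vector<unsigned char> edid((std::istreambuf_iterator<char>(f)),
                                        std::istreambuf_iterator<char>());
        if (edid.size() < 128) { std::cerr << "no or short EDID\n"; return 1; }
        // byte 126 of the base block is the number of extension blocks;
        // TVs usually ship a CEA-861 extension, PC-labelled inputs may differ
        std::cout << "EDID bytes: " << edid.size()
                  << ", extension blocks: " << int(edid[126]) << "\n";
        return 0;
    }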

Isn’t that an HDMI issue, and in some cases, DP?

On the original topic, HDR computer monitors will only become mainstream in the consumer market at least a year after they've become the majority standard for GUI monitors in post-production for the motion picture industry. Pro 10-bit monitors had been a de facto standard for decades before there was any true 10-bit consumer monitor.

Unfortunately, that doesn't apply to Linux devs. They're unlikely to ever move to those monitors and start developing for them. No one on Xorg or Wayland will even start work on HDR within this decade.

It has nothing to do with DP or HDMI. The desktop still has to render in 10-bit if it is asked to, even for an 8-bit monitor; that doesn't matter.
Whether it's bit depth, subsampling or levels, the GPU driver makes it work; that's the driver's job.

BTW, it would be nice if you would stop insulting devs and their hardware. I'm pretty sure they have bought a TV in the last 10 years which supports 12-bit input (which is pretty much all TVs)…

And what is the OS on new TVs? Android or something, but that has nothing to do with Linux, right?

@Huhn_Morehuhn I found in the past that SDI never had any problem with EDID; I suspect it's about protocols/standards.

@FurryJackman My point is that without much demand, which correlates to consumer-market levels of demand and not just a small portion from a niche business, there wouldn't be much money available for any devs. Until the motion picture industry (and I forgot to add, gaming) companies actually buy 12-bit GUI monitors regularly (which in turn means they're using them to produce 10-12 bit material for worldwide consumption), the manufacturers won't have any business interest in ramping up production of expensive-to-build, higher-than-normal color density computer monitors; because, well, there's not much demand for it relative to actual demand in the whole market.

There’s also no demand because creating code doesn’t depend on impossibly high contrast ratio.

Huhn, if you feel like this is insulting, why not do something about it? Investigate why programs break under 10-bit and 12-bit in Xorg and Wayland and contribute fixes. The only reason I can't contribute is that the code for Xorg and Wayland is above my skill level.

Android also doesn’t use Xorg or Wayland. It uses SurfaceFlinger.


That’s true, that’s where the higher-than-normal color density part comes in. But there’s also more to it.

For end users, Linux devs included, 8-bit to 10-bit color depth is a huge difference perceptually; 10 to 12 not as much. The engineering involved in manufacturing the parts, though, from 8 to 10 or 10 to 12, more than doubles in difficulty, just by the constraint of the extra signal bandwidth they need to accommodate and sustain.
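
To put rough numbers on the bandwidth part, here is a back-of-the-envelope sketch for 4K at 60 Hz RGB (raw active-pixel payload only, ignoring blanking and link encoding overhead):

    #include <cstdio>

    int main() {
        // 3840x2160 at 60 Hz, three channels per pixel
        const double pixelsPerSecond = 3840.0 * 2160.0 * 60.0;
        const int depths[] = {8, 10, 12};
        for (int bpc : depths) {
            double gbps = pixelsPerSecond * bpc * 3 / 1e9;
            std::printf("%2d bpc: ~%.1f Gbit/s\n", bpc, gbps);
        }
        return 0;
    }

That works out to roughly 11.9, 14.9 and 17.9 Gbit/s respectively, before any panel-side processing.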

This is why pro displays are very expensive: displays that have luminance consistency across the entire panel, fulfill high standards for color spaces and other aspects like signal frame rate, offer consistent color accuracy guaranteed over a period of time (years), and come with the industrial-level interfaces their target users need. Manufacturers don't mass-produce a product just because they have done R&D on it; there needs to be an actual "let me throw money at you to get that product" demand. As an effect, this has led to a small number of players in that field.

Considering that the price range of available pro monitors, depending on their specs (including even 10-bit ones), is $2k to $LOL, and the capital spent to get to a production-ready product is obviously more than any single unit price, whatever Linux devs do with the software won't have the slightest impact at this earliest stage, unless they start to manufacture 12-bit monitors themselves, or the pro monitors get produced in enough quantity that the bottom of the barrel can be monetized as consumer-level products.


If that's the case, do it: show us how big that difference is.
https://abload.de/img/bit1xpj4a.png
https://abload.de/img/bit238ke9.png

One of them is not even close to 8-bit.

The engineering involved in manufacturing the parts, though, from 8 to 10 or 10 to 12, more than doubles in difficulty, just by the constraint of the extra signal bandwidth they need to accommodate and sustain.

Parts?
32-bit processing is the number that's written as TFLOPS on GPUs,
because that's what they do: 32-bit floating-point math.
And about the signal part: this is decades old, every consumer-grade GPU with DX10 support can do it.

And displays are so expensive, yes?
AOC Q3277PQU
Native 10-bit, and from 2015…
And again, it's irrelevant for adding support for this, because you only need a display with 10-bit input support (actually you don't even need that), which is pretty much every TV from the past 10 years, and new displays go for 150…

You just have to understand that you need the right tool for the job.
And if you want 10-bit and HDR in Linux, this is what you need:


Just for Linux. Not a desktop that runs in 10-bit…


If you put work into it, I'll donate. I plan to have a 4K curved display sometime within 2 years from now, and if they come out with HDR ready to go, I will certainly want to use it.

Also note: NVIDIA will never care whether your display tech works. Better to look at AMD, IMO.

Monitor hardware is not about theoretical throughput numbers from the GPU; sustained luminance for HDR displays ranges from 1000 to 10000 nits. That's just the luminance level, not including color accuracy and panel uniformity.

So yes, parts.

Sony BVM CRTs are native 10-bit HD, and they're from the 90s. 10-bit consumer-level monitors weren't available before 10-bit pro displays became a mainstream standard for professionals, and pro 10-bit monitors are still expensive even today, pro 12-bit even more so. Therefore 12-bit consumer-level monitors are not going to be a common product, for the reasons I already wrote there.

You need to learn to read.



The engineering involved in manufacturing the parts, though, from 8 to 10 or 10 to 12, more than doubles in difficulty, just by the constraint of the extra signal bandwidth they need to accommodate and sustain.

This is what you said. What does any of this have to do with PQ and brightness?

10-bit monitors are very common and you can get one for 300 euro easily; that's the plain truth. There is nothing special about it, it's very old tech; 12-bit input is a default feature for pretty much every TV produced in the past decade.

What's wrong with you? What do I have to read about professional displays… why do they even matter for this?
You can get an HDR10 display for 150 euros, like the BenQ EW277HDR. The point of this topic is getting this signal into such a screen/TV; whether the panel is native 10-bit or not is irrelevant for this…

And here is the spec you didn't care to look up for the AOC Q3277PQU:


If this page says it is from 2015, then it is almost certainly correct:
https://geizhals.eu/aoc-q3277pqu-a1281190.html


You do know that input can be downconverted to force processing to a lesser output, right? You quoted the part of my post where I was talking about why 12-bit monitors aren't going to be tackled by Linux devs due to lack of actual demand, which I then proceeded to explain further. That word, 12-bit, is color bit depth, and anywhere in the world, listing a bit-depth spec means being capable of displaying that actual color and bandwidth. Meaning 12-bit input, 12-bit output. When your beloved BenQ monitor says it's 10-bit, it is actually stating it can output or display 10-bit, not more.

So your screenshot of a 12 bpc NVIDIA input to a 10-bit monitor as proof of "it's very old tech; 12-bit input is a default feature for pretty much every TV produced in the past decade" and what I quoted below can only mean either you're lacking reading comprehension or you're a troll. HDR10 displays for consumers actually didn't exist before 2016, so I assume you got "the past decade" from some dark hole.

I know how processing works, and I know you can use 10-bit for non-HDR.
And I know that "only" Panasonic plasmas actually send 12-bit to the panel. I would have shown you 10 there, but NVIDIA… But that's not the point; the only part that needs 2 more bits per pixel to do that is the panel. It's not done because there is nothing to gain from it.

My TV in that screenshot is 8-bit (don't tell anyone, but so are the majority of even high-end HDR TVs; you have a higher chance of getting a 10-bit panel when you buy a cheap TV with a 60 Hz panel, because doing 120 Hz UHD 10-bit is quite some work)… Do you really think someone sees the difference between an 8-bit panel and a 10-bit UHD one with proper dithering? Hell, it's worse when I send a higher bit-depth signal, and this is not a cheap TV. The TV is non-HDR, BTW. 12-bit processing is at least 10 years old; sorry, I don't have such an old TV here anymore to prove that too, and they had 10-bit panels back then too. Maybe my old 1080p TV was 10-bit; I will never know.

I even gave you a 6-bit banding test to show you it's not easy to see the difference since people figured out dithering.
All this doesn't change that there are a lot of true 10-bit panels; the linked one is an old non-professional one, others have been discontinued and replaced.
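
As a toy sketch of why dithering hides the steps (not what any driver actually does, just the principle): quantize a nearly flat 10-bit ramp to 8-bit, once by plain truncation and once with sub-LSB noise added first.

    #include <cstdio>
    #include <cstdlib>

    int main() {
        std::srand(42);
        for (int x = 0; x < 16; ++x) {
            int v10 = 512 + x;                     // a nearly flat 10-bit ramp
            int plain = v10 >> 2;                  // truncation: hard 4-value steps
            // add sub-LSB noise before truncating so the average level is kept
            int dithered = (v10 + std::rand() % 4) >> 2;
            std::printf("10-bit %4d -> plain %3d, dithered %3d\n",
                        v10, plain, dithered);
        }
        return 0;
    }

The truncated values move in hard steps while the dithered ones preserve the average level, which is exactly what the linked banding test shows.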

10-bit professional mainstream displays are so old I don't even…
Eizo and Dell have been doing these for, like, ever; compared to a BVM and its successors they were and are pretty much free…

You do know that input can be downconverted to force processing to a lesser output, right? You quoted the part of my post where I was talking about why 12-bit monitors aren't going to be tackled by Linux devs due to lack of actual demand, which I then proceeded to explain further

The engineering involved in manufacturing the parts, though, from 8 to 10 or 10 to 12, more than doubles in difficulty, just by the constraint of the extra signal bandwidth they need to accommodate and sustain.

12-bit, hmm?
And don't tell anyone, but selling an 8-bit+FRC display as 10-bit is done all the time.

At least you have come far enough to say it's a lack of demand, which I cannot deny, and not that they can't get such a device…

The current technical problem is related to how Linux deals with 10-bit: they want to run everything in 10-bit, unlike Windows, which runs the desktop at 8-bit, is totally independent from the GPU driver output, and only goes above 8-bit with professional hardware by abusing RGBA8 via OpenGL.
Windows does 10-16 bit using fullscreen surfaces.

Linux has been able to do 10-bit, in its non-working state, for a very long time now…
