8 bit 4:4:4 or 10 bit 4:2:2?

Hey guys,

Recently purchased a Samsung Q80R. It’s great, but to my dismay I discovered that the bandwidth of HDMI 2.0 can’t support 10-bit 4:4:4.

So I’d have to choose. Which one would you guys recommend more?

When you set an output other than RGB on your GPU, it has to do a colour-space conversion, which it does using a fast, lossy method that usually results in a loss of image quality. So I would stick with 8-bit RGB.
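To make that concrete, here is a rough sketch of an RGB ↔ YCbCr round trip (assuming the BT.709 limited-range matrix; the exact coefficients a GPU driver uses may differ). Rounding to 8-bit integers at each step is where the loss creeps in:

```python
# Sketch: RGB -> YCbCr -> RGB round trip with 8-bit rounding at each step.
# Assumption: BT.709 limited range (luma 16-235, chroma 16-240).

def rgb_to_ycbcr(r, g, b):
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    y  = 16  + luma * 219 / 255
    cb = 128 + (b - luma) / 1.8556 * 224 / 255
    cr = 128 + (r - luma) / 1.5748 * 224 / 255
    return round(y), round(cb), round(cr)   # rounding = the lossy step

def ycbcr_to_rgb(y, cb, cr):
    yf = (y - 16) * 255 / 219
    r = yf + (cr - 128) * 1.5748 * 255 / 224
    b = yf + (cb - 128) * 1.8556 * 255 / 224
    g = (yf - 0.2126 * r - 0.0722 * b) / 0.7152
    return round(r), round(g), round(b)

# Round-trip a saturated pixel: the result is close, but not always identical.
orig = (200, 30, 90)
back = ycbcr_to_rgb(*rgb_to_ycbcr(*orig))
print(orig, "->", back)
```

The per-pixel error is small, but it is exactly this kind of off-by-one drift that shows up as banding in smooth gradients.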


Thanks! Are there any side effects to using RGB on a TV connected to a PC? As in, why would people generally use 4:4:4 on a TV connected to a PC?

I would suggest you read this:

It explains it, pretty pictures included.

Isn’t 4:4:4 still RGB though, just with finer grading? Or am I missing something?

Most TVs will take an RGB input; some (usually older ones) don’t, so you have the option to convert.

No, it’s still a luma channel and two chroma channels rather than three colour channels like RGB. 4:4:4 can be mathematically equivalent to RGB, as in it’s a 1:1 conversion, but it can also be in a different colour space which needs conversion.

Oh that’s what you mean, I see.

However, there is also this on the aforementioned link:

It’s also important to note that subsampling on a PC requires using the YCbCr/YUV format as RGB does not support it.

So unless the TV takes a pure RGB input there is no way to avoid it, as I understand.

Yeah, even though all TVs have to convert to RGB to display an image they use YUV for the signal because it can be compressed using chroma subsampling (all video uses 4:2:0). So if a TV doesn’t accept an RGB input or you don’t have the required bandwidth then it’s unavoidable.

The problem, however, isn’t just that you are compressing the colour, as that isn’t very noticeable (like I said, all video uses 4:2:0). The problem is that a PC can only render in RGB and has to convert to YUV, which is usually done with a fast, lossy method to avoid adding latency. So setting your PC to 4:2:2 or 4:2:0 looks a lot worse than a normal 4:2:0 video; it’s especially noticeable around text.
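A toy version of what 4:2:0 subsampling does to a chroma plane (assuming simple 2×2 block averaging down and nearest-neighbour back up, which is roughly the "fast" path being described). Luma keeps full resolution, which is why brightness detail survives but sharp colour edges, like coloured text, smear:

```python
# Toy 4:2:0 chroma subsampling: each 2x2 block of chroma samples is
# averaged into one value, then stretched back out on display.
# Assumption: box-filter downsample + nearest-neighbour upsample.

cb = [                       # a 4x4 chroma plane with a sharp vertical edge
    [90, 90, 90, 200],
    [90, 90, 90, 200],
    [90, 90, 90, 200],
    [90, 90, 90, 200],
]

def subsample_420(plane):
    out = []
    for y in range(0, len(plane), 2):
        row = []
        for x in range(0, len(plane[0]), 2):
            block = (plane[y][x] + plane[y][x + 1] +
                     plane[y + 1][x] + plane[y + 1][x + 1])
            row.append(block // 4)   # one chroma sample per 2x2 block
        out.append(row)
    return out

def upsample_nearest(plane):
    out = []
    for row in plane:
        wide = [v for v in row for _ in (0, 1)]
        out.append(wide)
        out.append(list(wide))
    return out

small = subsample_420(cb)            # 4x less chroma data
restored = upsample_nearest(small)
print(restored[0])                   # the sharp 90|200 edge is now smeared
```

The edge column that was a clean 90 → 200 step comes back as an averaged 145 across two pixels, which is exactly the fringing you see around coloured text.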


Honestly, when I set it to 4:2:2 I can hardly tell a difference around text… I am running it at 4K, but maybe I need to do more testing (I’m also running at 225% scaling).

Also, I opened up Illustrator and played with some gradients to see if I could find any banding. Couldn’t really tell 8 bpc and 10 bpc apart ¯\_(ツ)_/¯

10-bit yuv422p is better quality than 8-bit yuv444p as a format, because in limited-bandwidth scenarios displaying more colours matters more than displaying fewer colours more accurately.

In practice, however, it also depends on what is being converted to what; and if the bandwidth is high enough, the typical issues that occur with 8-bit conversions won’t manifest at all.
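To put rough numbers on the bandwidth trade-off (just counting raw samples per pixel, ignoring HDMI blanking and transport overhead):

```python
# Back-of-envelope bits per pixel for the two candidate formats.
# Assumption: raw sample counting only, no blanking/encoding overhead.

bpp_444_8bit  = 8  * (1 + 1   + 1)    # Y + Cb + Cr, all at full resolution
bpp_422_10bit = 10 * (1 + 0.5 + 0.5)  # chroma halved horizontally
print(bpp_444_8bit, "vs", bpp_422_10bit)
```

So 10-bit 4:2:2 actually needs slightly *less* bandwidth (20 bpp vs 24 bpp), which is why it fits where 10-bit 4:4:4 (30 bpp) does not.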

Banding is really common in low-bandwidth encodes. If you want to see it happen in real time, convert a .png with a gradient to .jpeg and turn the compression up really high, then set the compression really low. Then do the same with .tiff in 8-bit 4:4:4 and 10-bit 4:2:2 (TIFF supports YUV). Since TIFF is (usually) lossless, banding should not occur during the RGB->YUV conversion.
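The mechanism behind gradient banding can be shown without any image files at all: squeeze a smooth 10-bit ramp into 8 bits and count how many distinct levels survive (a simplified sketch; real encoders band for compression reasons too, not just bit depth):

```python
# Where banding comes from: a smooth 10-bit gradient truncated to 8 bits
# collapses every 4 neighbouring values into one flat step.

ramp10 = list(range(1024))          # 1024 distinct 10-bit levels
ramp8  = [v >> 2 for v in ramp10]   # truncate to 8 bits

print(len(set(ramp10)), "levels ->", len(set(ramp8)), "levels")
print(ramp10[:5], "->", ramp8[:5])  # adjacent values become identical bands
```

Each flat run of identical 8-bit values is one visible band; dithering hides them by trading the steps for noise.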

There is also YCbCr 4:4:4, which is not subsampled at all.

And the reason RGB isn’t subsampled is clearly not that it isn’t supported; subsampling RGB would just be super dumb and utterly useless for video presentation. A debayer filter is the closest thing to this, and it creates RGB.

@rest just because you set your GPU to 8, 10 or 12 bit doesn’t mean the program you run is now magically rendering at a higher bit depth.

RGB <-> YCbCr conversion is lossy and can easily create banding; black-and-white images are the exception.
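The black-and-white exception falls straight out of the maths: for a grey pixel (r == g == b) the chroma terms cancel exactly, so all the information lives in the luma channel and nothing is lost in conversion. A small check (assuming BT.709 limited-range coefficients):

```python
# Why greyscale survives RGB <-> YCbCr intact: for r == g == b the
# (b - y) and (r - y) terms are zero, so Cb and Cr sit exactly at
# their neutral value of 128. Assumption: BT.709 limited range.

def chroma(r, g, b):
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = 128 + (b - luma) / 1.8556 * 224 / 255
    cr = 128 + (r - luma) / 1.5748 * 224 / 255
    return round(cb), round(cr)

print(chroma(180, 180, 180))  # grey pixel: chroma is exactly neutral
print(chroma(180,  40, 220))  # coloured pixel: chroma carries information
```

Since neutral chroma channels carry no information to round off, the only rounding happens in luma, which is a straight scale and back.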

Want to see banding? Here you go:

If this doesn’t get dithered by the program displaying it, you will see banding.

This topic was automatically closed 273 days after the last reply. New replies are no longer allowed.