How to set 10-bit color depth?

Nvidia Control Panel is showing my new 8-bit + FRC (advertised as 10-bit) monitor as 8-bit. Supposedly there was a way to change this, and supposedly software would recognize the FRC interpolation and see the display as 10-bit. Plus, with the HDR support the monitor has, it would only make sense. What the hell can I do to change it?

You can set it in the resolution section of the Nvidia Control Panel, but you will only be able to set it higher if you have enough bandwidth. So for argument's sake, if it's HDMI 2.0 you can do 4K 60 Hz with 8-bit, or 4K 30 Hz with 10-bit, at full RGB.

So if it doesn't give you the option to set 10-bit, try a lower resolution to check that it's not just a bandwidth limitation.
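Roughly, here's a back-of-the-envelope sketch of that bandwidth math (the ~20% blanking overhead and the effective link rates are approximations; real timings vary):

```python
# Rough check: does an RGB video mode fit a link's effective bandwidth?
# Effective payload rates after 8b/10b encoding overhead (approximate):
EFFECTIVE_GBPS = {
    "HDMI 2.0": 14.4,        # 18.0 Gbps raw
    "DP 1.2 (HBR2)": 17.28,  # 21.6 Gbps raw
    "DP 1.4 (HBR3)": 25.92,  # 32.4 Gbps raw
}

def required_gbps(width, height, hz, bits_per_channel, blanking=1.20):
    """Approximate bit rate of a full-RGB mode, with ~20% blanking."""
    return width * height * hz * 3 * bits_per_channel * blanking / 1e9

for w, h, hz, bpc in [(3840, 2160, 60, 8), (3840, 2160, 60, 10),
                      (3840, 2160, 30, 10)]:
    need = required_gbps(w, h, hz, bpc)
    verdict = "fits" if need <= EFFECTIVE_GBPS["HDMI 2.0"] else "does NOT fit"
    print(f"{w}x{h}@{hz} {bpc}-bit: ~{need:.1f} Gbps -> {verdict} HDMI 2.0")
```

4K 60 Hz at 8-bit comes out at roughly 14.3 Gbps, right at the HDMI 2.0 limit; 10-bit pushes it to ~17.9 Gbps, which is why you have to drop resolution or refresh rate.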

Using a DP cable on my 4GB GTX 960. Maybe it's the GPU? :thinking: Hmm…

If the DP version of both the GPU and the display is high enough to support the resolution, then it'll be the cable. But it may be the GPU, as it's old enough that it might be using an older DP version.


So, I’m using a 1080ti in my setup, and just got a new monitor for Christmas from the wife.

The monitor is 165 Hz (DP 1.4) and 144 Hz (HDMI 2.0), capable of 10-bit color on a 1440p curved panel. I'm using a DP cable that meets DP 1.4 standards (rated for 8K 60 Hz).

Hardware:
GTX 1080ti - GPU
DP 1.4, 6' - cable
Dell D3220DGF - monitor

When I first set up my monitor I could see 10-bit color as an option in Nvidia's control panel. However, after making some adjustments, I no longer had 10-bit as an option, and it was defaulting to 8-bit.

I have read one user stating that the “max refresh rate supported on the DP standard is 120 Hz for 10-bit.” I read elsewhere that it's a “custom resolution issue that causes the loss of support for 10-bit color in Nvidia's control panel.” The latter doesn't seem to make sense, as I'm using the preferred resolution (2560x1440).

Is there any truth to either of the above answers that may be causing my issue?


Make sure the monitor has DP1.4 turned on. If you set an overclock mode for high FPS, turn that off. Try lower refresh rates. I had to go down to 98 Hz on my display, for example.
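For the numbers behind that, here's a rough estimate (full RGB, ~10% reduced-blanking overhead, published effective link rates); it shows why 10-bit at 165 Hz needs the DP 1.4 link active, and why dropping the refresh rate also works:

```python
# DP 1.2 (HBR2) carries ~17.28 Gbps of payload, DP 1.4 (HBR3) ~25.92 Gbps.
# Estimate assumes full RGB and ~10% reduced-blanking overhead,
# typical for high-refresh panels.

def need_gbps(w, h, hz, bpc, blanking=1.10):
    return w * h * hz * 3 * bpc * blanking / 1e9

for hz, bpc in [(165, 8), (165, 10), (120, 10)]:
    print(f"2560x1440@{hz} {bpc}-bit: ~{need_gbps(2560, 1440, hz, bpc):.1f} Gbps")

# ~16.1 Gbps (165 Hz, 8-bit)  -> fits DP 1.2
# ~20.1 Gbps (165 Hz, 10-bit) -> needs DP 1.4
# ~14.6 Gbps (120 Hz, 10-bit) -> fits DP 1.2, which is why lowering
#                                the refresh rate also brings 10-bit back
```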

Yep. That did it. Went into Advanced Display Properties, Monitor, switched refresh rate. Now on 10-bit colour depth. YEAH BABY!

And on a serious side note: I'm really sketched out by the issue I had with my new monitor… I switched from DP to HDMI and the problem went away… Then I decided to use DP again, but from another port on my GPU, and the problem is gone. Played a bunch of games… no issues. I actually posted to BenQ's support subreddit and they insisted I contact them anyway… :thinking:

It was something in the monitor's settings that it didn't like. I'm now running DP 1.4 at 165 Hz and 10-bit color. I'm happy as all get out now! I never knew picture quality could look so good. I could watch HDR videos for days on this thing!

Do you have an older Nvidia GPU? If so, you are probably having the same problem I had.

Nvidia has seen fit to DISABLE 10-bit output on older graphics cards in their new driver releases. I ran into this problem on my GTX 670. I contacted Nvidia support and complained about features being pulled, and their response was that they don't and never have supported 10-bit on the GTX 670. I told them it had worked for two years with no problems, and they gave the same response. I rolled back the driver, even though the old drivers supposedly have security vulnerabilities. I will buy an AMD GPU. I will resist purchasing anything from Nvidia for a good while. I don't like having features pulled 10 years down the road in an attempt to force an upgrade. Once you go 10-bit you can't go back…

So the solution for you might be to roll back your driver. I will edit this post with the working 10-bit driver version when I am back home.

GTX 670
4K 10-bit HDR monitor
Cable recommendations:

Club 3D Displayport Cable 1.4 8K 60Hz VESA certified 3 Meter/9.84Feet 28AWG Black Color CAC-1060 https://www.amazon.com/dp/B07F85RQD2/ref=cm_sw_r_cp_apa_i_PjhpEbB30HFNH

Club3D CAC-1373 High Speed HDMI Cable 8K 60Hz Up to 10K 120Hz with DSC 1.2, 3 Meter/9,84 Feet Black, Male-Male https://www.amazon.com/dp/B07VK7JWH5/ref=cm_sw_r_cp_apa_i_AkhpEb2XNMQ7X

@OrbitaLinx, OP’s problem was solved back in December.

Yep. That did it. Went into Advanced Display Properties, Monitor, switched refresh rate. Now on 10-bit colour depth. YEAH BABY!

Oh, thanks for letting me know. I think I'll create a thread for my post just in case anyone else has the same problem I have. Can we get this thread closed?

Up to @Prenihility

Makes sense. I had a problem like this for a while, but a Windows update or Nvidia driver update (or both) fixed it. It also created an issue in games that would cause tearing and hide my display's proper resolution and refresh rate… This issue was compounded for me because the refresh rate kept resetting to bad settings between games and reboots…

Err. Sorry about the late replies, folks. Haven't signed into the forums in a while. Yeah, all is well. New monitor is sweet. 10-bit colour working great. And RTINGS finally released their fellatio-contest review of my monitor. And I had to see it. Needless to say, it was pretty pathetic. It really brings OPINIONS to my attention: even though you might logically think sources like that simply use metrics to measure the performance of a device, a lot of it is sadly opinionated. It scores lower than a VG27AQ, but on paper alone the VG27AQ is the inferior monitor. Idk. shrugs.

Anyway, the 10 bit colour issue is resolved.

I'm just glad to know I wasn't the only one having a problem. Makes me feel better about my monitor.


this bit depth topic is really getting sad…
so much misconception.

so you set the nvidia control panel to 10 bit and now you have 10 bit and everything is better?
do you even notice how limited the number of programs is that even use 10 bit? heck, if you are using windows, your desktop is still at 8 bit, and the same goes for every windowed program…
while the rtings banding score is utterly flawed, an 8 bit panel can in reality easily beat a 10 bit panel, and the same goes for 8bit+frc.
did you even test the 10 bit mode to see what they may mean by banding issues?

bit depth has pretty much nothing to do with banding; it's the processing that's important. if your processing is flawless, bit depth only affects the noise level of the presentation.
it's so easy to prove and has been proven countless times.

Just get a decent non-FRC 10-bit display and test for yourself. You can get an Acer for around $300; download some 10-bit native photos / art and switch between 10 and 8-bit. Go to commonly used websites that you are familiar with and compare between 8 and 10-bit. Watch Blu-ray quality films and compare the difference. Once you go 10-bit you can't go back.

It also becomes easy to be disappointed by certain sites not using 10-bit for their site assets, and you would be surprised which ones, too. Cough, Netflix, cough… Soo 8-bit. Now their videos might be 10-bit depending on how old they are or how they were remastered, but their login page and other assets are PAINFULLY 8-bit. Don't use the new photo viewer in Windows 10 to look at anything; it turns the quality to swiss cheese. You can use the old photo viewer from 7, or a different alternative, for way better quality.
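If you don't have 10-bit native art handy, you can generate a test gradient yourself. A quick sketch, assuming numpy and opencv-python are installed (cv2.imwrite writes a 16-bit PNG when given a uint16 array); view the result fullscreen in a 10-bit-aware viewer and flip the GPU output between 8 and 10-bit:

```python
import numpy as np
import cv2  # opencv-python: imwrite emits a 16-bit PNG for uint16 input

W, H = 3840, 600
# Horizontal gray ramp over a darkish range where banding shows most.
# Values are 10-bit codes (0..1023) shifted into the top bits of 16.
ramp10 = np.linspace(64, 192, W).astype(np.uint16)   # 10-bit code values
img = np.tile(ramp10 << 6, (H, 1))                   # scale 10 -> 16 bit
cv2.imwrite("gradient_10bit.png", img)               # 16-bit grayscale PNG
```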

you know that setting your GPU to 8, 10, or even 12 bit doesn't change the fact that your windows desktop stays 8 bit, and pretty much every program you are running can't even use 10 bit. you need professional software and a professional GPU to get 10 bit windowed, or the application has to be fullscreen using d3d10 or newer with a 10 bit swap-chain format.

and i suggest you read about video “10 bit” and why that's simply not 10 bit RGB.
the topics are YCbCr → RGB conversion and how a lossy codec works.
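to make that concrete, here's the decode arithmetic with the standard BT.709 limited-range constants (chroma subsampling and the codec itself ignored; the sample values are arbitrary): 8 bit YCbCr decodes to fractional RGB, so an 8 bit RGB output has to round away precision that a 10 bit output can keep.

```python
# BT.709, limited ("studio") range: Y' in 16..235, Cb/Cr in 16..240.
def ycbcr8_to_rgb(y, cb, cr):
    """8-bit limited-range Y'CbCr -> R,G,B floats in 0..1."""
    yn  = (y - 16) / 219.0
    cbn = (cb - 128) / 224.0
    crn = (cr - 128) / 224.0
    r = yn + 1.5748 * crn
    g = yn - 0.1873 * cbn - 0.4681 * crn
    b = yn + 1.8556 * cbn
    return r, g, b

r, g, b = ycbcr8_to_rgb(100, 140, 160)        # an arbitrary in-gamut sample
for name, v in (("R", r), ("G", g), ("B", b)):
    print(f"{name}: exact {v:.4f}   as 8-bit {round(v * 255) / 255:.4f}   "
          f"as 10-bit {round(v * 1023) / 1023:.4f}")
```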

I have a few Quadros lying around.

Just because the Windows UI is 8-bit does not mean that my DirectX 10-bit background is not 10-bit. As far as websites go, as long as you are using the correct browser you can view websites / web images in 10-bit, and the Windows photo viewer does show 10-bit images in 10-bit. VLC also does 10-bit video in 10-bit and has a DirectX mode to prove it. Of course, enabling 10-bit does not magically make 8-bit content 10-bit; I am not saying that. What I am saying is that I prefer 10-bit content, specifically seek out and own 10-bit content, and can't live without a 10-bit capable GPU and display, and that 8-bit photos, video, and complex web elements look like crap compared to 10-bit. I also play video games and get very disappointed when they are released in 8-bit. It looks cheap and lazy, and honestly the developer could have done better and didn't…

I can see the difference between 10-bit and 8-bit on both Quadros and GeForce GPUs, as well as Radeon cards, which don't play the “it has to be a professional GPU” game. But honestly, neither does Nvidia at this point, unless apparently you have an old GPU and they just remove the feature to force a purchase, which is why I went with a Radeon this time.