I am playing with AMD Adrenalin, and I was curious about its color depth options: 8 bpc, 10 bpc, and 12 bpc. 8 bpc is the default at the native refresh rate of 165 Hz, 10 bpc caps out at 144 Hz, and 12 bpc at 120 Hz. The native refresh rate of this monitor is 165 Hz.
Question 1: If I lower it to 120 Hz for 12 bpc, will that harm the monitor?
Question 2: Can anyone tell the difference from 120 Hz to 165 Hz?
Question 3: Can anyone tell the difference between 8 bpc, 10 bpc, and 12 bpc?
I notice frame drops when I switch to a higher bpc.
I’d venture to guess your monitor supports FreeSync, so it supports vertical refresh rates between ~30 and 165 Hz. In any case, lowering the refresh rate will not damage the monitor.
In regard to color depth, without knowing the model of your monitor I can’t tell you if it supports 10 bits per channel (bpc). Certain high refresh rate IPS monitors add 10 bpc support through temporal dithering, meaning they rapidly alternate a single pixel between two colors to produce a third “in between” color, thereby allowing a greater range of colors to be displayed.
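To make that concrete, here’s a minimal Python sketch of the idea behind temporal dithering (often called FRC). The 4-frame cycle and the pixel-level math are illustrative assumptions, not how any specific panel implements it:

```python
# Minimal sketch of temporal dithering (FRC): an 8 bpc panel approximates a
# 10 bpc level by alternating between the two nearest 8-bit levels over a few
# frames, so the intensity the eye averages out lands "in between".

def frc_frames(level_10bit: int, num_frames: int = 4) -> list[int]:
    """Return the 8-bit levels shown over num_frames frames for one pixel."""
    low = level_10bit // 4           # nearest 8-bit level at or below the target
    frac = level_10bit % 4           # remainder, in quarter-steps (0..3)
    high = min(low + 1, 255)
    # Show `high` for `frac` of every 4 frames and `low` for the rest.
    return [high if f < frac else low for f in range(num_frames)]

target = 513                         # a 10-bit level with no exact 8-bit match
frames = frc_frames(target)
print(frames, "-> perceived:", sum(frames) / len(frames), "target:", target / 4)
```

Real panels typically also stagger the pattern across neighbouring pixels so the flicker isn’t visible.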
If you’ve got Photoshop, or any other application that supports 10 bpc (referred to as “30 bit display” in Adobe software), you can easily tell the difference between 8 bpc and 10 bpc by creating a gradient from black to white.
While an 8 bpc monitor can display only 256 shades of grey, a 10 bpc monitor can display 1024. For this reason, a simple black-to-white gradient will show banding on an 8 bpc monitor but will appear smooth on a 10 bpc monitor.
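If you want to see those numbers directly, here’s a quick NumPy sketch that quantizes the same black-to-white ramp at both depths and counts the distinct grey levels; the 4096-sample ramp is just an arbitrary stand-in for a continuous gradient:

```python
import numpy as np

# Quantize one "ideal" black-to-white ramp at 8 and 10 bits per channel and
# count how many distinct grey steps survive; fewer steps means coarser bands.
ramp = np.linspace(0.0, 1.0, 4096)

for bits in (8, 10):
    levels = 2 ** bits - 1                       # highest code value
    quantized = np.round(ramp * levels) / levels
    print(f"{bits} bpc: {len(np.unique(quantized))} distinct grey steps")
```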
It’s important to note that higher bit depths are included mostly for content creation; games, webpages, and the vast majority of video content are distributed at 8 bpc or less. As I alluded to earlier, even content creation programs like Photoshop need to include specific support for higher color depths. Applications like Blender or GIMP cannot currently make use of higher color depth displays and will always output 8 bpc.
Answer 1: No, if the monitor displays a correct image, there’s no harm in using a higher bit depth. It should be no different from running a higher refresh rate, except that the data is delivered as larger frames rather than more frequent frames.
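For a rough sense of why the monitor trades refresh rate for bit depth, here’s a back-of-the-envelope bandwidth calculation in Python. The 2560x1440 resolution is an assumption (you didn’t name the model), and real links also carry blanking and encoding overhead:

```python
# Raw pixel data per second for each mode the OP listed, ignoring blanking
# intervals and link encoding. 2560x1440 is an assumed resolution.
width, height = 2560, 1440

for refresh_hz, bpc in [(165, 8), (144, 10), (120, 12)]:
    bits_per_pixel = 3 * bpc                       # three channels (RGB)
    gbit_per_s = width * height * bits_per_pixel * refresh_hz / 1e9
    print(f"{refresh_hz} Hz @ {bpc} bpc: ~{gbit_per_s:.1f} Gbit/s of pixel data")
```

Notably, the 10 bpc and 12 bpc modes come out identical no matter the resolution, since 144 × 30 = 120 × 36 bits per pixel per second; that’s presumably why Adrenalin caps them where it does.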
Answer 2: Yes. I actually notice a clear difference in smoothness between the two, though the individual steps from 120 to 144 Hz and from 144 to 165 Hz are much more subtle. I didn’t expect there to be a difference when I got a 165 Hz display and was surprised at how noticeable it was.
Answer 3: Absolutely! But your monitor, software, and viewing environment will affect this dramatically, so it’s very likely you won’t see as large a difference as you might expect from quadrupling your per-channel color depth twice (each extra 2 bits quadruples the shades per channel). There’s also the matter of software support, or lack thereof.
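For reference, here are the raw numbers behind that “quadrupling twice”, as a snippet you can run yourself:

```python
# Shades per channel and total displayable colors at each bit depth.
for bpc in (8, 10, 12):
    shades = 2 ** bpc                  # per-channel shades double with each bit
    print(f"{bpc} bpc: {shades} shades/channel, {shades ** 3:,} total colors")
```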
Outside of HDR games, software support for deep color is effectively nonexistent, and enabling it may cause compatibility problems with some software. I wouldn’t recommend it for gaming or content consumption at this time.
Something that might make for a better time, though, is seeing if your display has an Adobe RGB color profile and using that for SDR content. A lot of content will appear oversaturated and blue-tinted, but much of the older TV, PC game, and console game content was actually designed for NTSC, which is much closer to Adobe RGB than to sRGB.
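You can sanity-check that claim from the published CIE xy primaries of each gamut. Triangle area in xy space is only a crude proxy for gamut size, but it shows NTSC 1953 sitting right next to Adobe RGB (they share the same green primary):

```python
# Gamut triangles from the published CIE xy primaries (R, G, B) of each space.
PRIMARIES = {
    "sRGB":             [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)],
    "Adobe RGB (1998)": [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)],
    "NTSC 1953":        [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)],
}

def triangle_area(pts):
    """Shoelace formula for the area of the gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

for name, pts in PRIMARIES.items():
    print(f"{name}: area ~{triangle_area(pts):.3f}")
```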
Thanks for the answers. I have been running at 120 Hz (solid) at 12 bpc. It looks absolutely gorgeous.
1: No, and running at 120 Hz will actually save a bit of power and some wear on components.
2: Marginal at best; you might notice a little motion blur on your mouse, but that’s about it.
3: 12 bpc gives a finer range of steps from darks to lights in each RGB channel and overall more accurate colour representation; it’s basically a prerequisite for high dynamic range.
additional…
If you have a gaming GPU, then 12-bit won’t matter; gaming GPUs top out at 10-bit.
If you want 12-bit output, you’re looking at an enterprise-level GPU.