Can a DP1.2 monitor support HBR3?

Yes, I got that part; I was asking if anyone has actually tried it. Looking online, it appears this dithering feature is not officially supported on NVIDIA and is buggy when forced via a driver hack. Seems not worth the trouble.

Yes. So in your opinion, is it worth returning the GB for another 8-bit + FRC monitor like the MSI to get 10-bit at 170 Hz?

Even games that support HDR, or games in general?

The FRC transformation should be performed at the monitor level, so no driver hack should be needed at all.

The GPU sends a normal 10-bit signal; the display device does its own thing.
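To make that concrete, here is a rough Python sketch of what temporal dithering (FRC) does inside the panel: a 10-bit level is approximated by flickering between the two nearest 8-bit levels across frames. The function name and the 4-frame cycle are made up for illustration; this is the basic idea, not any vendor's actual algorithm.

```python
def frc_temporal_dither(value_10bit, num_frames=4):
    """Toy model of 8-bit + FRC: approximate one 10-bit level by
    alternating between the two nearest 8-bit levels over several
    frames, so the time-averaged brightness lands in between."""
    exact = value_10bit / 4              # 10-bit 0..1023 -> 8-bit 0..255, keeping the fraction
    low = int(exact)                     # nearest 8-bit level below
    frac = exact - low                   # how much of the next level up is needed
    high_count = round(frac * num_frames)
    return [min(low + 1, 255)] * high_count + [low] * (num_frames - high_count)

# 10-bit 514 sits halfway between 8-bit 128 and 129:
frames = frc_temporal_dither(514)
print(frames, sum(frames) / len(frames))   # [129, 129, 128, 128] 128.5
```

The key point is that all of this happens after the panel receives the signal, which is why the GPU side doesn't need any special handling.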

RE: return
Try it out first; it's a niche concern. Panel quality also matters: a bad 10-bit panel might be worse in the end than a very good 8-bit + FRC one. So test it with your own eyes.

Otherwise it's kind of like 4K monitors: once you get used to them, you can't go back.

I have personal experience with semi-pro photography and had a pro monitor; now I can't unsee those graphical artifacts.

HDR support itself is a "maybe" as an indicator. It requires a larger colour space due to the expanded gamma/dynamic range, but that's not the same as 10-bit mastering.
Additionally, even in-game HDR support can be half-assed. Despite having a true HDR-capable monitor, I don't play games in HDR, so no real experience there.
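For a sense of why the expanded range pushes toward 10-bit, here is a back-of-the-envelope comparison. It assumes naive linear coding and typical peak-brightness figures (100 nits SDR, 1000 nits HDR10), which are my assumptions; real HDR signals use the non-linear PQ curve, which spends codes far more cleverly.

```python
# Rough arithmetic only; real HDR uses the PQ transfer function,
# not a linear split of the luminance range.
sdr_peak_nits = 100     # typical SDR mastering peak (assumption)
hdr_peak_nits = 1000    # common HDR10 mastering peak (assumption)

for name, levels in [("8-bit", 2 ** 8), ("10-bit", 2 ** 10)]:
    print(f"{name}: ~{sdr_peak_nits / levels:.2f} nits/step over an SDR range, "
          f"~{hdr_peak_nits / levels:.2f} nits/step over an HDR range")
```

Spreading roughly ten times the luminance range over the same 256 codes makes each step about ten times coarser, which is where the banding pressure comes from.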

I don't think I have ever seen anything consumer-oriented that was explicitly 10-bit mastered. And I don't expect to, given the size of the target pool.

I found these samples at Kodi. I can’t tell the difference here.
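If you want a harsher test than those samples, a shallow gradient ramp makes banding much easier to spot. A quick sketch (the ramp range, image size, and output filename are arbitrary choices; requires NumPy and Pillow):

```python
import numpy as np
from PIL import Image

# A shallow brightness ramp stretched across the screen: the top half is
# plain 8-bit quantisation (visible bands), the bottom half adds a little
# noise before quantising (crude dithering), which hides the bands.
width, height = 1920, 200
ramp = np.tile(np.linspace(0.45, 0.55, width), (height, 1))

banded = np.round(ramp * 255).astype(np.uint8)
noise = np.random.uniform(-0.5, 0.5, ramp.shape)
dithered = np.clip(np.round(ramp * 255 + noise), 0, 255).astype(np.uint8)

Image.fromarray(np.vstack([banded, dithered]), mode="L").save("banding_test.png")
```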

While searching I found one game (Alien: Isolation) that claimed 10-bit mastering, but some people on Reddit say 10-bit makes a difference in terms of color gamut in some games.