Nvidia has DISABLED 10-bit output on older GPUs

Do you have an older Nvidia GPU? If so, you are probably having the same problem I had.

Nvidia has seen fit to DISABLE 10-bit output on older graphics cards in their new driver releases. I ran into this problem on my GTX 670. I contacted Nvidia support and complained about features being pulled, and their response was that they don’t, and never have, supported 10-bit on the GTX 670. I told them it had worked for two years with no problems and they gave the same response. I rolled back the driver, even though the old drivers supposedly have security vulnerabilities. I will buy an AMD GPU. I will resist purchasing any Nvidia products for a good while. I don’t like having features pulled ten years down the road in an attempt to force an upgrade. Once you go 10-bit you can’t go back…

So the solution for you might be to roll back your driver. I will edit this post with the working 10-bit driver version when I am back home (a quick version check is also sketched at the end of this post).

Last supporting driver
436.48 for Win 10 64 bit

GTX 670
4K 10-bit HDR monitor
Cable recommendations

Club 3D DisplayPort Cable 1.4 8K 60Hz VESA certified 3 Meter/9.84Feet 28AWG Black Color CAC-1060 https://www.amazon.com/dp/B07F85RQD2/ref=cm_sw_r_cp_apa_i_PjhpEbB30HFNH

Club3D CAC-1373 High Speed HDMI Cable 8K 60Hz Up to 10K 120Hz with DSC 1.2, 3 Meter/9,84 Feet Black, Male-Male https://www.amazon.com/dp/B07VK7JWH5/ref=cm_sw_r_cp_apa_i_AkhpEb2XNMQ7X
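
I’m not at that machine right now, but here is a rough sketch of how you could check which driver you’re actually on before and after the rollback. It just shells out to nvidia-smi (assuming it is on your PATH) and compares against the 436.48 cut-off mentioned above; this is my own illustration, not anything official from Nvidia.

```c
/* Rough sketch: read the installed NVIDIA driver version from nvidia-smi and
 * compare it against 436.48, the last release reported in this thread to still
 * expose 10-bit output on the GTX 670. Assumes nvidia-smi is on the PATH. */
#include <stdio.h>
#include <stdlib.h>

#ifdef _WIN32
#define popen  _popen
#define pclose _pclose
#endif

int main(void) {
    FILE *p = popen("nvidia-smi --query-gpu=driver_version --format=csv,noheader", "r");
    if (p == NULL) {
        fprintf(stderr, "could not launch nvidia-smi\n");
        return 1;
    }

    char line[64] = {0};
    if (fgets(line, sizeof line, p) == NULL) {
        fprintf(stderr, "nvidia-smi reported no driver version\n");
        pclose(p);
        return 1;
    }
    pclose(p);

    double version = atof(line);   /* e.g. "436.48\n" -> 436.48 */
    printf("Installed driver: %s", line);

    if (version > 436.48) {
        puts("Newer than 436.48 - 10-bit output may no longer be exposed on this card.");
    } else {
        puts("436.48 or older - 10-bit output should still be available.");
    }
    return 0;
}
```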

One could argue that there is indeed a reasonable point of obsolescence, no matter how annoying. I initially didn’t move to Windows 10 because I was happy on 7, and the EOL announcement didn’t sit well. But then I got a hold of myself and just got on with switching over. It’s really not that bad, and it’ll benefit me in the long run.

In the likely event that doesn’t sway you, let’s not be coy here. Nvidia has a right to stop supporting hardware that, according to you, is a LITERAL DECADE old.

Sorry.


This is sadly the new norm. Microsoft, Nvidia, and Apple are all guilty of it, Apple about worst of all. They removed force touch without much fanfare in iOS 13 because it wasn’t all that popular with developers, but what they replaced it with has made iOS 13 a UI nightmare: if you hover a finger too close to the screen, popups appear or windows scroll unexpectedly, which is very annoying since it’s difficult to hold onto the phone without at least one finger touching the screen.

Annoying, but when you play in somebody else’s closed garden, it’s difficult to complain when they say they don’t want to play anymore.


@Goblin They didn’t stop supporting it; they gimped it by disabling features instead. There’s a difference. They still produce compatible drivers, and I’d even understand if they left new features out. But they added new features and then decided to disable one of the most important existing ones for no other reason than to force a purchase. For this I shall purchase from their competitors. Before this happened I was looking at a 2080 Ti, but they lost that sale.


The feature could have been part of the issue, and they don’t have time to continually develop a fix for a card that barely anyone is using. If you don’t like them for that, then so be it. Eight years is a pretty good run, I think.

Man, that sucks, I get it. But this doesn’t really mean anything. Games work on Linux that aren’t supported. Hardware works on my Dell T410 that isn’t supported. Same with software/firmware, etc. Just because something works doesn’t mean it’s supported.

Go for it; I am always hearing good things. However, they do the same thing as Nvidia. They pulled their Radeon VII and it’s not even a year old, AFAIK. They might push updates and patches, etc. But you seem bothered that a ten-year-old piece of hardware is starting to lose functionality… No one supports hardware for 20, 30, 40, 50 years… Except IBM, maybe.


How about AMD dropping features fourish years after release?

There was a system with a 7870 that had a monitor that needed some color tweaks to look OK. After a driver update it started looking bad, so I tried to find the color settings. They had disappeared: AMD had dropped the old settings menu, which contained the color settings, entirely from the driver. The new control panel did not have color settings. They did eventually add them to the new control panel, but only after a delay of around nine months, I think.

Sources: https://community.amd.com/thread/209378
https://www.reddit.com/r/Amd/comments/5ouyja/get_your_color_settings_back/

That is a supported feature getting lost. You are complaining about an UNsupported bonus feature being dropped after a longer period than most large companies even provide new drivers for their products.

They did not stop supporting it, because they never started supporting it. Sure, it was a bonus, but it was never an official feature according to them.

@TheCakeIsNaOH AMD dropped that for everyone, not just the people who they decide need to give them money. And you can edit monitor color profiles in your operating system, so the graphics driver having color profiles is kind of redundant unless it’s a workaround for a hardware or driver-related problem.

Something I would like to point out here is that Nvidia dropped the feature and then claimed the new drivers patched security problems… I have not been able to find any information about the supposed security problems in the patch notes… So I’m calling BS on Nvidia. Links are welcome.

@Adubs Perhaps, but you would think they would have mentioned something about it in the patch notes, especially since they would be pulling a feature to fix it…

Page 14

It’s a feature they never claimed to support in the first place.

@OrbitaLinx Hey fam, is this what you’re looking for?

Behavior Change in NvEncCreateBitstreamBuffer API of Video Codec SDK

In the NVIDIA driver, Release 415 and later, the behavior of the Video Codec SDK API (NvEncCreateBitstreamBuffer) has been changed to return NV_ENC_ERR_UNIMPLEMENTED instead of NV_ENC_SUCCESS when the encoder instance is configured to run in the motion estimation-only mode. As an indirect consequence of this change, the AppEncME sample application from the Video Codec SDK prior to SDK version 8.2.16 will crash due to a bug in the NvEncoder class. The latest version of the SDK fixes this bug that affects the AppEncME sample application. NVIDIA recommends downloading the latest version from https://developer.nvidia.com/nvidia-video-codec-sdk

10 Bit is listed on that site.
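
For what it’s worth, if you actually program against the Video Codec SDK, that behavior change is straightforward to handle. Here’s a minimal sketch, assuming an already-opened encode session (the function list and encoder handle would come from nvEncodeAPICreateInstance / nvEncOpenEncodeSessionEx elsewhere); the helper name create_output_buffer is mine, not part of the SDK:

```c
/* Sketch: allocate an output bitstream buffer and cope with the Release 415+
 * behavior, where motion-estimation-only sessions return NV_ENC_ERR_UNIMPLEMENTED
 * instead of NV_ENC_SUCCESS. Assumes nvEncodeAPI.h from the Video Codec SDK. */
#include <stdio.h>
#include "nvEncodeAPI.h"

NVENCSTATUS create_output_buffer(NV_ENCODE_API_FUNCTION_LIST *api,
                                 void *encoder,
                                 NV_ENC_OUTPUT_PTR *out)
{
    NV_ENC_CREATE_BITSTREAM_BUFFER params = { 0 };
    params.version = NV_ENC_CREATE_BITSTREAM_BUFFER_VER;

    NVENCSTATUS status = api->nvEncCreateBitstreamBuffer(encoder, &params);
    if (status == NV_ENC_ERR_UNIMPLEMENTED) {
        /* ME-only mode produces no encoded bitstream, so newer drivers refuse
         * the allocation instead of silently succeeding. */
        fprintf(stderr, "bitstream buffers are unavailable in ME-only mode\n");
        return status;
    }
    if (status != NV_ENC_SUCCESS) {
        fprintf(stderr, "nvEncCreateBitstreamBuffer failed: %d\n", (int)status);
        return status;
    }

    *out = params.bitstreamBuffer;
    return NV_ENC_SUCCESS;
}
```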

@AnotherDev Thank you. I have read the notes, and they only mention the phrase “security problems” at the beginning, but there is absolutely no information about the security problems in the entire document. The GTX 670 is listed as supported hardware and is not listed in any feature pulls…

Right, they link to the CVE. Those things tend to be long.

@AnotherDev Thank you.

@Everyone If I am correct in my understanding of this problem, it is driver-related and should affect all supported GPUs, and the fix should have been applied to all of them.

In my opinion, unless I am wrong, this security problem does not explain Nvidia’s decision to pull 10-bit output on older GPUs in new drivers. And there has been no mention of this action, nor has there been any explanation in the patch notes.

https://nvidia.custhelp.com/app/answers/detail/a_id/4907

CVE:

CVE‑2019‑5690

Description:

NVIDIA Windows GPU Display Driver contains a vulnerability in the kernel mode layer (nvlddmkm.sys) handler for DxgkDdiEscape in which the size of an input buffer is not validated, which may lead to denial of service or escalation of privileges.
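
To put that description in concrete terms, it is the classic pattern of trusting a caller-supplied size field when copying into a fixed buffer. The snippet below is purely illustrative of that bug class; the struct and function names are made up and have nothing to do with Nvidia’s actual kernel driver:

```c
/* Illustration only: the unvalidated-input-buffer-size bug class described in
 * CVE-2019-5690, shown with made-up names. Not Nvidia's code. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

typedef struct {
    uint32_t size;          /* caller-supplied length of payload */
    uint8_t  payload[256];  /* caller-supplied data */
} escape_request_t;

static uint8_t scratch[64];

/* Vulnerable pattern: the declared size is never checked against the
 * destination, so a hostile caller can overflow 'scratch'. */
static void handle_escape_unsafe(const escape_request_t *req) {
    memcpy(scratch, req->payload, req->size);
}

/* Fixed pattern: reject requests whose declared size exceeds either the source
 * payload or the destination buffer. */
static int handle_escape_safe(const escape_request_t *req) {
    if (req->size > sizeof req->payload || req->size > sizeof scratch) {
        return -1;  /* invalid request */
    }
    memcpy(scratch, req->payload, req->size);
    return 0;
}

int main(void) {
    escape_request_t req = { .size = 32 };
    printf("safe handler result: %d\n", handle_escape_safe(&req));
    (void)handle_escape_unsafe;  /* kept only to contrast the two patterns */
    return 0;
}
```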

@AnotherDev Pertaining to your previous post about the NVENC encoder: it is my understanding that you need a Pascal or newer GPU to support that. I had to figure that out because I was trying to record some gameplay and set up my PC for streaming. But I will check the link on my PC and see if I can get it to work.

@Adubs True. I remember there being some confusion around 10-bit output and GeForce cards in the past. I researched this when I got my 10-bit monitor. People seemed to believe that it would be hit and miss as to which cards allowed it, but I believe the confusion was primarily around how and where to enable it, plus monitor detection issues. It took me a minute to figure out that I had to enable it in the Nvidia Control Panel, which I never had to use for monitor settings before getting a 10-bit monitor. There were plenty of people on forums flaming Nvidia for favoring Quadros and touting that AMD allowed it on their cards, and making unsupported statements about certain Nvidia models not supporting it; I was able to test a few and they were wrong. In any case, there were claims that Nvidia said they did not support 10-bit output on GeForce cards, but I swear I remember seeing it on the box…
