Nvidia will fully support Async Compute with software drivers

Somehow I don't really believe them. Guess we'll have to wait and see

Nvidia will fully support Asynchronous Shaders with its GPUs through the use of new software drivers. At present the feature is not fully implemented in the drivers, but Oxide Games is working with Nvidia to help bring the needed support.

“We actually just chatted with Nvidia about Async Compute, indeed the driver hasn’t fully implemented it yet, but it appeared like it was,” said Oxide developer ‘Kollock’. “We are working closely with them as they fully implement Async Compute. We’ll keep everyone posted as we learn more.”

Nvidia’s issues with Async Compute were made apparent by the DirectX 12 benchmark for Oxide Games’ Ashes of the Singularity, where AMD Radeon GPUs performed significantly better than Nvidia’s.

Oxide pinpointed the issue as being down to Asynchronous Shaders, one of DirectX 12’s new features. Nvidia’s GTX 900 series graphics cards are capable of handling Async Compute, but right now the software just doesn’t support the feature. With Oxide Games helping the graphics giant out, though, hopefully it won’t be long before those Ashes of the Singularity benchmark results start to look a little more even.
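
For anyone wondering what "Async Compute" actually means at the API level: in DirectX 12 an engine submits compute work on a separate compute queue alongside the usual graphics queue, and it is then up to the hardware and driver whether the two actually overlap. Here's a minimal C++ sketch of that setup, assuming an already-created ID3D12Device and with error handling omitted; this is just an illustration, not code from Oxide or Nvidia:

```cpp
// Minimal sketch: how DX12 exposes async compute. Work submitted on the
// compute queue *can* run concurrently with the graphics queue, but whether
// it actually overlaps is up to the GPU and the driver scheduler, which is
// exactly what the article is about. Assumes `device` was created elsewhere.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // Graphics ("direct") queue: accepts draw, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Separate compute-only queue: the point of async compute is that
    // command lists submitted here may execute alongside graphics work
    // instead of waiting behind it. Synchronization between the two
    // queues is handled with ID3D12Fence objects (not shown).
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}
```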


After all the lies they told and shit they pulled in the last couple of years, excuse me if I don't believe a fuckin' word of that.

The well-made ones of those are pretty good. Sad but kinda true, ish.

Give me a reason to believe Nvidia is not doing what they did for the past nine years...

12 years. Surely people still remember the FX 5800 Ultra "Leaf Blower"?

You know that their source is that long thread over at OCN, right? Mahigan's posts and the few posts from the Oxide guy Kollock:

It really doesn't matter that they don't have async in the drivers now, because very few if any games use the feature, unless you count games run under a custom version of Dolphin.

Besides, everyone's being a little harsh on Nvidia, considering they will have a major lead over AMD when Thunderbolt 3 ports are released on laptops, thanks to Optimus and better overscan/underscan fixing functionality. Not to mention the 900 series has support for hardware HEVC (H.265) decoding. Let's also not forget they released DSR on all GTX cards going back to the 400 series, while AMD's VSR is still just barely out of beta and only works on specific GCN GPUs.

That said, I will admit the GameWorks crap is underhanded and should stop. With the level of optimization that should be achievable using DirectX 12 and Vulkan, I doubt developers will have any need for it anyway.

I really don't see the point of DSR and VSR; there's like no difference. And while it's nice that 400 series cards support it, they can't really utilize it to run games at higher resolutions.

Otherwise, I think Carrizo APUs got H.265 support. I wonder if DX12 is going to affect APUs as well.

But given how much it improves performance for AMD cards, why wouldn't game devs use it?

I will have to disagree with you on DSR. I run Portal 1/2/Mel at 4K, and it does look better. Also, I have tested it on Heroes of the Storm and it is quite magnificent. There is basically no aliasing whatsoever. I plan on running multiple other games at 4K as well.

I'm sure it does a little something, but you're stressing your hardware and using more power for slightly finer-looking graphics. It's like how some phones and laptops have really high-resolution screens, but it's almost pointless because it just makes your fonts look sharper.

The whole point behind VSR and DSR is to take advantage of higher-resolution rendering on a 1920x1080 or 2560x1440 display and force your graphics card to use its full potential, in the case of the Fury X or the R9 X series. You can even force 1920x1080 on R7 series graphics cards paired with a cheap 720p or 900p monitor.
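
If it helps to picture it: conceptually, all DSR/VSR do is render the frame at a higher internal resolution and then filter it back down to what the monitor can show. Here's a toy C++ illustration of that downsampling step using a plain box filter over an RGBA buffer; the real drivers use fancier filters, and none of this is actual driver code:

```cpp
// Toy illustration of supersampling: average factor x factor blocks of a
// high-resolution render target down to the display resolution. The
// averaging is what smooths out jagged edges. Assumes a simple RGBA8 buffer
// and an integer scale factor (e.g. 2 for 3840x2160 -> 1920x1080).
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b, a; };

std::vector<Pixel> Downsample(const std::vector<Pixel>& src,
                              int srcWidth, int srcHeight, int factor)
{
    const int dstWidth  = srcWidth  / factor;
    const int dstHeight = srcHeight / factor;
    std::vector<Pixel> dst(dstWidth * dstHeight);

    for (int y = 0; y < dstHeight; ++y) {
        for (int x = 0; x < dstWidth; ++x) {
            int r = 0, g = 0, b = 0, a = 0;
            // Sum up the block of high-res pixels that maps to this
            // single display pixel, then take the average.
            for (int dy = 0; dy < factor; ++dy) {
                for (int dx = 0; dx < factor; ++dx) {
                    const Pixel& p =
                        src[(y * factor + dy) * srcWidth + (x * factor + dx)];
                    r += p.r; g += p.g; b += p.b; a += p.a;
                }
            }
            const int n = factor * factor;
            dst[y * dstWidth + x] = { uint8_t(r / n), uint8_t(g / n),
                                      uint8_t(b / n), uint8_t(a / n) };
        }
    }
    return dst;
}
```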

It does more than a little something. Like I said, in the games I have used it with, it virtually eliminates any aliasing. For instance, when I was playing Heroes, I was at the character select screen at native 1080p. I picked Diablo and was turning him around, noticing all the aliasing on his arms, claws, and horns. It was somewhat distracting. I applied the 4K resolution and it was just solid lines. It was quite awesome. And the textures pop a bit more as well.

You realized just how insidious the lies from Nvidia actually are if you still had a Fermi card a month after the 600 series launched. We were all told that a single 680 got the same FPS in the Unreal Engine 4 demo that 3-way SLI 580s got. They didn't tell you that not only was 3-way SLI scaling only 80% better than a single 580, but that they were comparing it to much older 580 drivers. Sure, on launch, for the solid month it took for all the benchmarks to come out, the 680 was handily beating the 580, but then a massive driver update put the 580 neck and neck with it. I know this for a fact because I had an EVGA 560 Ti-448 Classified Ultra (still have it, but mothballed), and when that driver update finally hit, at its full OC potential I was matching baseline 680 benchmarks. A 560 Ti-448 OCed matching a baseline 680. Annoyed me enough to jump back to AMD. But of course, these lies are often founded in the marketing hype behind everything and have swung both ways many times. Telling us the hype, omitting the rest.