Sapphire: AMD will unveil two Navi SKUs on Monday

For 4K gaming it gets a bit tight; for VR the extra performance would certainly help.

So basically 2060 and 2070 but ±0, mostly under because the AAA selection is more interesting

1 Like

It would be cool if we could use one GPU per eye for VR. That would probably require a complete rewrite of drivers, Vulkan/DX12, and game engines, but it would be the best way to use dual GPUs IMHO.

That would be sickening. It is annoying when the frame rate jitters due to microstuttering, but if your eyes started to get out of sync it would not end well.

2 Likes

I think the only concern with power draw is not cost… it's heat. The more power you draw, the more heat you need to dissipate.

1 Like

Aren't microstutters caused by the two GPUs rendering to the same screen? That means the secondary card has to write to the framebuffer of the primary, which causes latency issues. If the VR unit has two inputs, each connected to one GPU, that wouldn't be the case, right?

1 Like

There are two ways to do it: Split Frame Rendering, where both GPUs work on the same frame, and Alternate Frame Rendering, where the GPUs take turns rendering whole frames.

For VR, you would basically have two independent GPUs running side by side.
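To make the contrast concrete, here is a minimal C++ sketch of the two dispatch patterns. The Gpu type and renderFrame call are made-up stand-ins, not a real graphics API; the point is only the scheduling shape of AFR versus a hypothetical one-GPU-per-eye split.

```cpp
// Conceptual sketch only: Gpu/renderFrame are hypothetical stand-ins used to
// contrast the dispatch patterns, not calls into a real graphics API.
#include <cstdio>
#include <thread>

struct Gpu {
    int id;
    void renderFrame(int frame, const char* view) const {
        std::printf("GPU %d renders frame %d (%s)\n", id, frame, view);
    }
};

int main() {
    Gpu gpus[2] = {{0}, {1}};

    // Alternate Frame Rendering: whole frames alternate between the two GPUs.
    for (int frame = 0; frame < 4; ++frame)
        gpus[frame % 2].renderFrame(frame, "full frame");

    // Hypothetical per-eye split for VR: each GPU renders its own eye for the
    // same frame, in parallel, without touching the other card's output.
    for (int frame = 0; frame < 4; ++frame) {
        std::thread left ([&] { gpus[0].renderFrame(frame, "left eye");  });
        std::thread right([&] { gpus[1].renderFrame(frame, "right eye"); });
        left.join();
        right.join();
    }
}
```

In the per-eye case neither GPU has to copy its output into the other's framebuffer, which is exactly the transfer the earlier post blamed for microstutter; the remaining hard part is keeping the two eyes presented in lockstep.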

Probably best to still use one GPU. Sync of the images would still be important, as something showing up early in one eye versus the other might not be great and could cause sickness for the user, and possibly tearing as well.

1 Like

There's also DX12 and Vulkan explicit multi-GPU, which is really cool technology that nothing supports because nobody has multiple GPUs these days.
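If anyone wants to see what the explicit path looks like on the API side, here is a small sketch that just enumerates Vulkan physical-device groups, the unit explicit multi-GPU (core since Vulkan 1.1) operates on. It assumes a Vulkan 1.1 loader and SDK headers are installed; a real engine would go on to create one logical device over a group and schedule work per GPU itself.

```cpp
// Sketch: list the physical-device groups the Vulkan loader exposes.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "No Vulkan 1.1 instance available\n");
        return 1;
    }

    // Device groups are the granularity explicit multi-GPU works at.
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t g = 0; g < groupCount; ++g) {
        std::printf("Device group %u: %u GPU(s)\n", g,
                    groups[g].physicalDeviceCount);
        for (uint32_t d = 0; d < groups[g].physicalDeviceCount; ++d) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(groups[g].physicalDevices[d], &props);
            std::printf("  %s\n", props.deviceName);
        }
    }

    vkDestroyInstance(instance, nullptr);
}
```

On a single-GPU system this typically prints one group with one device, which is kind of the point above: the API support exists, the hardware configurations mostly don't.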

2 Likes

Overwatch has superb GPU scaling, though it is DX11.

Yes, and even minimal differences in silicon (no two GPUs are exactly the same) wouldn't matter at that point.


Personally I'm good on my gaming rig with a Vega 64, but a 2060/1070 equivalent could be nice for my other desktop (currently an RX 570). I would likely wait for Phoronix or L1T to test it on Linux first, though.

1 Like

Well, Microsoft dropped DXR after NVIDIA started having issues.

Which is like the best troll ever btw.

1 Like

Their raytracing stuff? I mean, that is hilarious, but now they have taken a standard potentially open to all on Windows and made it so NVIDIA will just have another PhysX/GameWorks.

FUCKING YAY!

Huh? DX12 DXR is open to AMD, Intel, and everybody else. So is Vulkan.

1 Like

I mean not really but ok.

So they have not then?

You made it sound like DXR was only for NVIDIA.

I'm perpetually stoned, excuse my idiocy.

2 Likes

To be fair, there is more content about Navi there than in my post anyway, since half of it was me ranting and saying that it isn't looking too good.

I'd be shocked if it were worse than Pascal, though. Being worse than Turing in power efficiency is almost a given, since AMD is architecturally behind by a wide margin.

7nm is such a massive gain that if you still have worse perf/watt than your competitor on a 16nm++ node, you need a new architecture.

2 Likes