For 4K gaming it gets a bit tight; for VR the extra performance would certainly help.
So basically 2060 and 2070 but ±0, mostly under because the AAA selection is more interesting
It would be cool if we could use one GPU per eye in VR. That would probably require a complete rewrite of drivers, Vulkan/DX12 and game engines, but it would be the best way to use dual GPUs IMHO.
That would be sickening. It is annoying when the frame rate jitters due to microstuttering, but if your eyes started to get out of sync it would not end well.
I think the only concern with power draw is not cost… it's heat. The more power you draw, the more heat you need to dissipate.
Aren't microstutters caused by the two GPUs rendering to the same screen? That means the secondary card has to write into the primary's framebuffer, which causes latency issues. If the VR unit had two inputs, each connected to one GPU, that wouldn't be the case, right?
There are two ways to do it: Split Frame Rendering, where both GPUs work on the same frame, and Alternate Frame Rendering, where the GPU doing the work alternates from frame to frame.
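The difference between the two modes can be sketched roughly like this. This is just an illustration of how work gets divided, not any real driver API; the GPU names and functions are made up for the example.

```python
# Toy sketch of the two classic multi-GPU work-splitting schemes.
# "gpu0"/"gpu1" are placeholder names, not real device handles.

def alternate_frame_rendering(frame_ids, gpus=("gpu0", "gpu1")):
    """AFR: whole frames are handed to the GPUs in round-robin order."""
    return {frame: gpus[i % len(gpus)] for i, frame in enumerate(frame_ids)}

def split_frame_rendering(frame_height, gpus=("gpu0", "gpu1")):
    """SFR: each GPU renders its own horizontal slice of the same frame."""
    slice_h = frame_height // len(gpus)
    return {gpu: (i * slice_h, (i + 1) * slice_h) for i, gpu in enumerate(gpus)}

# AFR: frames 0 and 2 go to gpu0, frames 1 and 3 to gpu1.
print(alternate_frame_rendering([0, 1, 2, 3]))
# SFR: gpu0 gets scanlines 0-539, gpu1 gets 540-1079 of a 1080p frame.
print(split_frame_rendering(1080))
```

AFR is what makes microstutter possible (frames from two different GPUs have to be presented in order on one display), while SFR needs the slices recombined before scan-out, which is where the framebuffer-copy latency mentioned above comes from.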
For VR, you would basically have two independent GPUs running side by side.
Probably best to still use 1 GPU. Sync of the images would still be important, as something showing up early in one eye vs the other might not be the best and might cause some sickness for the user, possibly tearing as well.
Thereās also DX12 and Vulkan explicit multi-GPU, which is really cool technology that nothing supports because nobody has multiple GPUs these days.
Overwatch has superb GPU scaling, though it's DX11.
Yes, and even minimal differences in silicon (no two GPUs are exactly the same) wouldnāt matter at that point.
Personally I'm good on my gaming rig with a Vega 64, but a 2060/1070 equivalent could be nice for my other desktop (RX 570 currently). But I would likely wait for Phoronix or L1T to test it on Linux first.
Well microsoft dropped DXR after nvidia started having issues.
Which is like the best troll ever btw.
Their raytracing stuff? I mean that is hilarious, but now they have taken a standard potentially open to all on Windows and made it so Nvidia will just have another PhysX/GameWorks.
FUCKING YAY!
Huh? DX12 DXR is open to AMD, Intel, and everybody else. So is Vulkan.
I mean not really but ok.
So they have not then?
You made it sound like DXR was only for Nvidia.
I'm perpetually stoned, excuse my idiocy.
To be fair, there is more content about Navi there than my post anyways, since half of it was me ranting and saying that it isn't looking too good.
I'd be shocked if it were worse than Pascal though. I mean, worse than Turing in power efficiency is almost a given since AMD is architecturally behind by a wide margin.
7nm is such a massive gain that if you still have worse perf/watt than your competitor on a 16nm++ node, you need a new architecture.