[AMD/RADEON] Navi NextGen™ GPU Rumors & News TINFOIL HAT Edition

So, I’ve taken the liberty of using AdoredTV’s leak to create a supposed benchmark based on GamersNexus’ values. These are the (probably wrong) results:

[Benchmark charts attached: navi_f1, navi_fc5, navi_s4e, navi_sottr]

Edit 1: Those are 4k benchmarks

Edit 2: AdoredTV’s leak states RX3080 has 1.15x Vega 64 performance.

Edit 3: Added RTX 2080 FE and made background darker to avoid eye cancer :smiley:
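For anyone who wants to sanity-check the method: the charts just scale measured Vega 64 results by the leaked 1.15x factor. A minimal sketch below, with placeholder FPS numbers for illustration only (not GamersNexus' actual measurements):

```python
# Projection method: multiply Vega 64 4K FPS by the leaked scaling factor.
# The FPS values here are made-up placeholders, not real benchmark data.
LEAK_SCALING = 1.15  # AdoredTV leak: RX3080 ~ 1.15x Vega 64

vega64_4k_fps = {
    "F1 2018": 60.0,
    "Far Cry 5": 52.0,
    "Shadow of the Tomb Raider": 44.0,
}

# Projected RX3080 FPS per title, rounded to one decimal place
rx3080_projection = {game: round(fps * LEAK_SCALING, 1)
                     for game, fps in vega64_4k_fps.items()}

for game, fps in rx3080_projection.items():
    print(f"{game}: ~{fps} FPS")
```

Obviously garbage in, garbage out: the whole projection is only as good as the 1.15x figure from the leak.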

2 Likes

Even if it is +/- 15%, depending on price and VRAM, not bad.

2 Likes

Yeah that’s true, even though I assume the RX3080 will be a 1440p card and not a 4k card. (That being said, 4k should be feasible depending on one’s preferences)

1 Like

Pack it in boys. We have one more year of Nvidia bringing out bare-minimum upgrades to ding our wallets, and of Intel buying enough talent to get into the GPU game.

AMD CPUs have been pretty damn good; I guess it was inevitable for them to let other things go to get there.

AMD Navi 20 GPU To Be Featured on Enthusiast Grade Gaming Graphics Cards, Features Enhanced GCN Architecture and Ray Tracing Performance Equivalent or Faster Than an RTX 2080 Ti

Other than that, we are hearing reports that AMD Navi GPUs may introduce features such as Variable Rate Shading which would be available across the Navi architecture. We have also seen various reports which indicate that Navi GPUs would be the underlying graphics architecture of next-generation consoles from Sony and Microsoft, so it looks like Navi, while being the last GCN-based GPU architecture, might end up capturing multiple markets for AMD and being competitive at the same time.

So starting off with the Navi 20 details, we first have the alleged Ray Tracing support and to be honest, I think it’s very likely that AMD would introduce their own take on Ray Tracing, especially when Crytek recently demoed their first ray tracing demo based on their CRYENGINE on an AMD Radeon RX Vega 56 graphics card. AMD would like to support Microsoft’s DXR API and provide a more open-source ray tracing feature support as they have done so with many other technologies in the past, e.g. TressFX, Freesync, OpenCL.

takes big sip of kool aid

2 Likes

I’ve seen that before… looks at Radeon V2

3 Likes

Since the Jan keynote I have zero belief… release it and I will care.

I know Navi will cover low-end hardware.

God knows how much money Sony pumped into AMD to have the first console to do true 4k gaming (or VR whatever).

It can’t get worse though.

https://forum.level1techs.com/t/topic/140607

This topic was automatically opened after 5 hours.

As for the AMD Radeon RX Navi GPUs, we are looking at a mainstream lineup which would replace the 14nm Polaris based Radeon RX 500 series family while offering better performance per watt and also some modern graphical features such as Variable Rate Shading and the supposed support for Microsoft’s DXR API. Now it’s not confirmed whether AMD RTG would be talking about the actual products or the underlying architecture of Navi GPUs. Currently, it looks like they will only be giving us an overview of the architectural details with a proper launch scheduled for mid or second half of 2019.

1 Like

The number of people who spend more than $300-400 US on a GPU is… very small.

The 2080 Ti may be the hero card of the year, but the number actually SOLD is… small.

The vast bulk of the market is buying 1060s (or 1660? whatever Nvidia called its replacement), maybe a 1070/1070 Ti, or an RX 580 or lower.

Radeon VII (and Vega before it, and to some degree Fury before it) excels at stuff the average gamer doesn’t care about.

If you just want maximum game FPS per dollar, that’s what Navi is going to target.

If you want extreme high end, Navi is not for you.
If you want best performance in some non gaming workload, Navi is not for you.

That’s not what it’s aimed at.

1 Like

Re: crossfire

It’s dead along with SLI.

  • For GPU compute it is irrelevant. You can use multiple cards for that without crossfire/SLI being involved
  • For DX12/Vulkan, other multi-GPU implementations are coming that are actually built from the ground up for different/multiple GPUs. Crossfire and SLI were a hack that, for the most part, didn’t work very well.
  • Few games demand more than a single card is capable of. Rather than buying TWO shit-tier cards, just buy one decent card and don’t deal with the multi-GPU headaches.
  • The number of multi-GPU machines out there is tiny, unless you’re talking discrete GPU plus onboard APU/Intel integrated graphics. And Crossfire or SLI won’t work with those, hence DX12/Vulkan multi-GPU.

But the short story is that multi-GPU setups are simply NOT worth a game developer’s time. There is zero ROI. You’re talking a huge amount of headache, code optimisation, and support costs for 1% of 1% of the market. Non-starter.

2 Likes

That’s not what it’s aimed at.

I think AMD’s GPU offerings will be pretty tame until they start making more money that they can dump into R&D. They have had some serious adoption in console platforms, a start to mobile platforms and partnerships with hardware vendors like Apple. I don’t think they are afraid of the research, but they may be looking to have it covered by direct deals.

1 Like

Good news is I have a quality 1080 Ti, so I don’t have to worry about fuckin around with graphics or monitors anytime soon… bad news is I would love something more powerful from AMD that can easily handle 4K, preferably at higher than 60 FPS. Seems like I’m staying at 1440p w/144 Hz for years to come.

I would agree. Google Stadia mentions it. I suspect it will be some custom API they make.

At the price points high-end graphics cards are at, it’s dead.

1 Like

For 100% certain, given Google’s platform will be Linux and neither Crossfire nor SLI is Linux compatible at the moment.

Yeah, you’re pretty good until consoles catch up. Game devs keep coding to consoles. At least the next ones have 8 GB of VRAM, I believe, and need multi-core coding. Not to mention more RAM for the game to run in.

Stadia as well has 2.7 GHz cores, so multithreading is needed. Good for PC gaming.

Would be nice to see games with 4K or 8K textures, 12-16 GB of VRAM, and 32 GB of RAM. You can mod engines with textures, but the game is usually coded to fit in a console’s RAM.

If Stadia takes off (and I think it will for normies playing non-twitch titles, due to no download required and game rental), I don’t think we’ll see that for some time.

That’s because of the compression for streaming. For most people, 1080p is good enough, and 1080p with less compression is better than heavily compressed 4K, IMHO, due to fewer compression artifacts. And if that’s the case, 4K or 8K textures are pointless.

I could be wrong. But outside the fringe of extreme high-end hardware gamers, 1080p is plenty good enough. Better to put the resources towards more detailed models, better physics, etc. to create more believable, engaging game worlds than to just bump the pixel count at this point, IMHO.
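To put rough numbers on the compression point: at a fixed streaming bitrate, 4K gets only a quarter of the bits per pixel that 1080p does, so it has to be compressed much harder. A quick sketch (the 35 Mbps and 60 fps figures are assumptions for illustration, not Stadia’s actual specs):

```python
# Bits-per-pixel budget at a fixed streaming bitrate.
# 35 Mbps / 60 fps are assumed example figures, not published Stadia specs.
BITRATE = 35_000_000  # bits per second
FPS = 60              # frames per second

def bits_per_pixel(width: int, height: int) -> float:
    """Average bit budget available per pixel per frame at this bitrate."""
    return BITRATE / (width * height * FPS)

print(f"1080p: {bits_per_pixel(1920, 1080):.3f} bits/pixel")
print(f"4K:    {bits_per_pixel(3840, 2160):.3f} bits/pixel")
```

Since 4K has exactly four times the pixels of 1080p, its per-pixel bit budget is exactly a quarter, which is where the extra compression artifacts come from.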

1 Like

And that’s the issue Navi has… it has to get people with all these 1080 Tis off the card and onto an AMD platform, and many of them will have to change monitors because we had to buy G-Sync. That’s a tough sell… I’m thinking a 25 to 35% improvement minimum over the 1080 Ti to get switchers. Nvidia is struggling to do that with the 20XX series as it is, and that’s without a monitor update.