The VEGA 56 / 64 Cards Thread! General Discussion



Nope, sorry. Just wanted this for myself and felt like sharing.
It is a bit of work to get the light right and in LR I didn’t even have to remove that much.
Doing that for the whole thing would be a very different story.


No worries, thanks. Was just hoping you had cropped it…


From seeing a lot of benchmarks and numbers,
I'm personally a bit disappointed with how Vega performs in gaming.
We had to wait a long time for this…
Of course drivers will improve performance over time,
but I was still kind of expecting a bit more out of it.
Still, I don't see any reason not to buy it if the price is right.


I would argue they’ve been going to shit since they allowed the marketplace.


Timestamped. Almost 15 minutes about the pricing BS. And I agree with most of it.


The review embargo alliance thing would actually be great I think.


Reworked my Vega wallpaper in two colors. Full size downloads here.


According to the article from GamersNexus, it has basically been confirmed that AMD themselves have increased the price by no longer offering an applied-at-sale rebate that was given to Newegg and other retailers when they bought cards. Prices will creep up due to miners, but the increase from $500 to $600 on the V64 and from $400 to $500 on the V56 is down to AMD.


Yep, didn’t see it earlier but this is probably very telling too:


Apparently Vega is not reporting the correct clocks.


This seems to all be an extension of Vega hardware not being meant for games, and AMD trying to force it to sell, then trying to bait and switch the prices into a profitable range.

But that raises the question: what is it about HBM2 or AMD's new caching system that is poorly suited to games, yet makes enough of a difference on the pro/server side that AMD was willing to make Vega almost non-viable for gaming?

Put another way, why does Vega seem to be fine for the Radeon Pro and Instinct product lines, but for RX the costly HBM seems to make it overpriced and do little else?

Especially when they advertise Radeon Pro for VR, I really just don’t get it. How can games, and presumably VR games as well, suck, but not Pro VR?


Buildzoid mentioned that in his last rant video a day ago or two. Honestly, software not showing the correct whatever number for something brand new… should not be that shocking anymore to anyone who has been around for a few hardware generations.

So, damn close to 1080 performance “sucks” now?


I was under the impression that it under-performs for its price (when AMD does not subsidise via the now-discontinued rebate); or at least that the Vega56 does.

“Suck” was a bit harsh I suppose, but the disconnect between Game vs Pro performance is baffling to me when it sounds like it also applies to “Game VR” vs “Pro VR”.

Rendering/Compute vs Display having widely different performance makes sense to me, but Display vs Display does not.


No clue what you are referring to. Can you show me what you mean with “Pro VR”?


Prior to release I thought I saw VR tests among the things AMD let people test long before games. They also mentioned VR in the advertising for the Vega FE or something Radeon Pro related.

Given your reaction, maybe I'm misremembering this, but the impression I got was that the Vega architecture works well for VR product modelling and the things tested by SPECviewperf, but not so well for games.


Where are you getting your facts about HBM2 and AMD's cache controller sucking for games? To my knowledge there are no games out yet that utilize such technology, nor have there been any discussions from developers about it.


And you are forgetting to mention that it isn’t even functioning properly yet.

Will be interesting to see how RX Vega matures.


But does SPECviewperf specifically utilise it either?
I thought the consensus was that the Vega architecture did very well in such things and was competitive with Nvidia Pascal/Volta for professional work but not for gaming.

That is why when AMD talks about VR as now a part of professional work, it throws me for a loop.


I don't understand graphics processing architecture that well, so I couldn't tell you. I was only interested in where you got your opinions about HBM2 and the HBCC from.


I was just thinking in terms of HBM2 being a large part of the cost of the Vega RX cards, with seemingly very little to show for it; and from how much they talk about it, it seems like the HBCC was a significant part of the development costs.