CES 2019 - Consumer Electronics Show

Nah dude, you clean up well. :kiss:

1 Like

But couldn’t they just get HBM that clocks better? It seems like it would still cost less. Or at the very least, shave off 4 GB of HBM and cut the price by $100. You would have a $600 GPU with 12 GB of HBM.

1 Like

I posted this in the other thread, but the real story is that it’s a binned Vega 20. It was never intended for consumers/gamers, but Nvidia made that possible by not focusing on rasterization performance and jacking up prices on Turing.

I still wouldn’t buy one. Even though it’s the same price/performance as a 1080ti/2080, it doesn’t have ray-tracing and will undoubtedly run much hotter.

2 Likes

The system looks nice, like really really nice.

One note: there are some jump cuts where the image zooms in and out; I would avoid those.

1 Like

We’re still all new to this and we had to cut out the part where Joe rambled on about how he’s a disappointment to his wife.

2 Likes

I would have to look for datasheets and get to know the clocks they are running.
But $100 for 4 GB of HBM is a very high estimate. $15 to $30 is the expected price cut per stack.
( We may have to move to here )
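A rough back-of-envelope sketch of why dropping a stack doesn’t get you anywhere near a $100 cut, using the $15–$30 per-stack estimate above (an estimate, not a confirmed BOM figure):

```python
# Back-of-envelope check on the "shave 4 GB, cut $100" idea.
# Per-stack savings are the rough $15-$30 estimate from this thread,
# not confirmed bill-of-materials numbers.

stacks = 4                    # Radeon VII: 4 stacks x 4 GB HBM2 = 16 GB
save_low, save_high = 15, 30  # estimated USD saved by removing one stack

print(f"12 GB card (3 stacks) saves roughly ${save_low}-${save_high}")
print(f"...and loses {100 // stacks}% of the memory bandwidth with it")
# Even at the high end that is ~$30, nowhere near a $100 price cut.
```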

2 Likes

They literally announced the name; you don’t have to speculate on what it’ll be called… Also, I assume that 4GB stacks aren’t actually much more expensive than 2GB stacks anymore, so AMD decided to provide an actual increase when moving back to 4096-bit rather than stagnating and risking VRAM limitations in future applications (something that Fiji got railed for).

3 Likes

Nope, it’s a binned Vega 20, and the Instinct MI50 has 16GB of HBM2.

You know what else could have fixed that issue too…GDDR6 or even GDDR5/X.

Vega can run with DDR4 and HBM2, so why not GDDR5/X or 6? There were three ways they could have kept the GPU price down, and two of them wouldn’t sacrifice much performance.

I am hoping Navi somehow cannibalizes Vega II because the RTX will eat it alive.

It’s not so simple, they would need to completely redesign the memory controller, and Vega is notoriously memory bandwidth starved as it is. But I agree that if this GPU was intended for consumer/gaming use, they should have gone with GDDR5X or GDDR6.

Okay, fair enough, but then why release this card at all? It doesn’t have any merits over the also-overpriced RTX 2080 in the market it’s competing in, and I thought there wasn’t going to be a 7nm Vega for gaming.

Is it a defective version of the Radeon Instinct or something? If so, how much does that GPU cost?

To AMD’s credit, it is THEIR most powerful gaming GPU without a doubt. They got very close to GTX 1080 Ti performance and most likely surpassed the GTX 1080. The problem is that it isn’t competing with the 1080 but with the 2080.

As far as I can tell from the specs released so far, it looks to literally be a rebranded Instinct MI50. I haven’t seen prices on the Instinct series, but it’s certainly a lot more than $700.

They released it for the two reasons I stated: Nvidia didn’t focus on performance in existing games, and Nvidia jacked up pricing to a point where AMD could afford to sell it and make money.

It competes with the GTX 1080 Ti and RTX 2080, not the GTX 1080.

2 Likes

GDDR6 does not provide a significant bandwidth improvement over HBM2, and switching would require a substantial redesign of the core and supporting components such as reference PCBs. Vega is extremely bandwidth starved, and by simply doubling the number of HBM2 stacks they also immediately double the bandwidth.
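To put numbers on the bandwidth point, here is a rough peak-bandwidth comparison; the bus widths and data rates are ballpark public figures I’m assuming here, not specs quoted in this thread:

```python
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * data rate per pin (Gbit/s).
# Figures are ballpark public numbers, used only to illustrate the scaling.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

configs = {
    "Vega 64 (2 HBM2 stacks, 2048-bit @ ~1.89 Gbps)":  (2048, 1.89),
    "Radeon VII (4 HBM2 stacks, 4096-bit @ ~2.0 Gbps)": (4096, 2.0),
    "Hypothetical 256-bit GDDR6 @ 14 Gbps":             (256, 14.0),
}

for name, (width, rate) in configs.items():
    print(f"{name}: ~{bandwidth_gbs(width, rate):.0f} GB/s")

# ~484 GB/s -> ~1024 GB/s just from doubling the stacks, while a 256-bit
# GDDR6 setup lands around ~448 GB/s.
```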

Vega 64 was the competition for the GTX 1080 and traded blows with it handily. Some overclocked Vega 56s can also do the same. Radeon VII is going up against the RTX 2080 and, by extension, the GTX 1080 Ti/Titan Xp. The original GTX 1080 shouldn’t even be remotely close to competition for this card.

2 Likes

Someone is a bit… salty.

1 Like

Well, if they turn on DLSS I suppose that’s true, as you’re effectively only rendering half resolution then upscaling it with super magic AI. And it straight-up doesn’t support raytracing.
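For context on the “half resolution” bit, a quick pixel-count sketch; the 1440p internal render target is the commonly reported DLSS figure for 4K output, which I’m assuming here rather than something stated in this thread:

```python
# How much of a 4K frame DLSS actually renders before upscaling,
# assuming the commonly reported 1440p internal resolution.

native   = 3840 * 2160   # 4K output: ~8.3 million pixels
internal = 2560 * 1440   # assumed DLSS internal render: ~3.7 million pixels

print(f"Rendered pixels: {internal / native:.0%} of native 4K")  # ~44%
```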

The FreeSync thing is all FUD. The Nvidia booth has a FreeSync demo showing a non-certified monitor performing like dogshit: flickering, terrible. But we all know FreeSync works fine on AMD, even on monitors with very limited variable refresh windows, so that’s an issue with Nvidia’s implementation, not the VRR standard.

2 Likes

Hopefully the Radeon VII does actually give the 2080 a run for its money when the AIB boards come out. Some price drops on both would be nice.

“Freesync doesn’t just work”

Uhh, excuse me, Mr. CEO, isn’t your company certifying FreeSync and finally supporting it? Yeah, he is salty that his precious G-Sync failed. But he doesn’t need to worry much; Vega II is just as overpriced.

1 Like

DLSS is in one game, right? And it has a lot of problems with small objects popping in and out of existence.

Raytracing is in one game, tanks performance and you have to look for it to notice it.

FreeSync - Am using it, no problems to report.


That is legit fear, covered by anger.

1 Like

DLSS still sounds like black magic to me. I haven’t seen any compelling evidence that it really works, but I’m open to the possibility.

Raytracing definitely works, and is a substantial improvement in immersion. Problem is it’s in only one game, and that’s a multiplayer fast-action title so you don’t notice it unless you look for it. It needs to be in an exploration game like Bioshock.

@MazeFrame: None of its competitors have announced products that should make Nvidia concerned in the slightest about their graphics business. Nvidia made a huge gamble on raytracing and so far, nobody has taken advantage of that opportunity.

An OC’d GTX 1080 or even an OC’d GTX 1070 Ti could probably get close to this card’s stock speeds, for much less. Not that I recommend overclocking but if the newer cards are going to be that much worse in value, may as well take the previous high-tier cards and upclock them. Oh, and if you want AMD, overclock them Vega 56’s and 64’s.

Like if I overclocked my i7 5820K to match a stock R7 1700.