
CES 2019 - Consumer Electronics Show



“Freesync doesn’t just work”

Uhh, excuse me, Mr. CEO, isn't your company certifying FreeSync and finally supporting it? Yeah, he's salty that his precious G-Sync failed. But he doesn't need to worry much; Vega II is just as overpriced.


DLSS is in one game, right? And it has a lot of problems with small objects popping in and out of existence.

Raytracing is in one game, tanks performance, and you have to look for it to notice it.

FreeSync - Am using it, no problems to report.

That is legit fear, covered by anger.


DLSS still sounds like black magic to me. I haven’t seen any compelling evidence that it really works, but I’m open to the possibility.

Raytracing definitely works, and is a substantial improvement in immersion. Problem is it’s in only one game, and that’s a multiplayer fast-action title so you don’t notice it unless you look for it. It needs to be in an exploration game like Bioshock.

@MazeFrame: None of its competitors have announced products that should make Nvidia concerned in the slightest about their graphics business. Nvidia made a huge gamble on raytracing and so far, nobody has taken advantage of that opportunity.


An OC'd GTX 1080 or even an OC'd GTX 1070 Ti could probably get close to this card's stock speeds, for much less. Not that I recommend overclocking, but if the newer cards are going to be that much worse in value, you may as well take the previous high-tier cards and clock them up. Oh, and if you want AMD, overclock those Vega 56s and 64s.

Like if I overclocked my i7 5820K to match a stock R7 1700.


It is okay-ish


There's no way to overclock a 1080 or 1070 Ti and get performance anywhere near a 1080 Ti or 2080. It's a MUCH faster GPU.

The promise of DLSS isn’t just higher-quality antialiasing, it’s that you’re actually rendering the game at lower resolutions and then scaling it up via magic AI fairy dust, not losing any image definition, and gaining tons of performance.
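The performance argument comes down to pixel counts: shade fewer pixels internally, then upscale to the output resolution. A minimal back-of-envelope sketch (the internal resolution and the linear cost-per-pixel assumption are illustrative, not measured DLSS figures):

```python
# Rough intuition for why render-low-then-upscale helps: shading cost
# scales roughly with the number of pixels rendered (simplifying assumption).

def pixel_count(width: int, height: int) -> int:
    """Total pixels in a frame."""
    return width * height

native = pixel_count(3840, 2160)    # 4K output target
internal = pixel_count(2560, 1440)  # hypothetical lower internal render

# Fewer pixels shaded -> roughly proportional speedup, before
# accounting for the upscaler's own cost.
speedup = native / internal
print(f"~{speedup:.2f}x fewer pixels shaded")  # ~2.25x
```

Whether the "magic AI fairy dust" recovers the lost detail is exactly the part people are skeptical about; the pixel math only explains where the performance headroom comes from.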


It's, like, what, 30-40% faster? And yes, that is a lot, and I don't think you can actually close the gap at all, but doesn't Pascal overclock like an animal?


If you have ray tracing and can’t use it because NVidia or the company for the particular title you are playing doesn’t implement it… do you have ray tracing?


Intel is a 300-pound gorilla, just like Nvidia. And Intel has a history of just throwing huge sums at random ventures.

Raytracing: Nvidia casts a few rays, then has to do substantial denoising. Nobody else has the experience to put so much compute into a product, or the balls and spare money to attempt real-time raytracing.


No, every Pascal chip hits a wall right around 2000 MHz. You can go higher with below-ambient cooling, but that isn't something you actually do for anything other than benchmarks.

@MazeFrame: I agree it was a huge swinging brass balls move. Could have ended the company, but they’re getting away with it.


There is a pretty well documented 20% performance gap between the GTX 1080 and GTX 1080 Ti at the standard ~1900 MHz turbo clocks. You're telling me you can run your GTX 1080 at nearly 2.3 GHz to catch up and perform as well as the stock 1080 Ti, 2080, and supposedly the new Radeon VII? A core clock that most people can't get with voltmodded cards running on LN2?

I’m skeptical.
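The arithmetic behind that skepticism, as a sketch. It assumes performance scales linearly with core clock, which is optimistic for the 1080 since real scaling is sub-linear (memory bandwidth doesn't move with core clock):

```python
# Back-of-envelope: what clock would a GTX 1080 need to close a 20% gap,
# assuming performance scales linearly with core clock?

base_clock_mhz = 1900   # typical GTX 1080 turbo clock (from the thread)
perf_gap = 0.20         # documented 1080 -> 1080 Ti gap

required_clock = base_clock_mhz * (1 + perf_gap)
print(f"Required clock: {required_clock:.0f} MHz")  # Required clock: 2280 MHz
```

2280 MHz is well past the ~2000 MHz wall mentioned above, which is the whole point: even the linear-scaling best case puts the target out of reach on air.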


And we have to wait until February now for reviews of the card stock vs. the 2080, and of its overclocking potential. It's based on Vega, and the 56 did overclock well. It's also on 7nm now.

No one knows if this is a good overclocker or a dud.


AMD is working on Ray tracing too. Quick, everyone, start talking about how great a technology it is.


Raytracing produces nice images.
However, without multi-GPU, I don't see real-time raytracing before 2026.


It is great technology. Chasing after higher resolutions is silly once you get to 4k, you can’t see the pixels anyway. Image quality matters. HDR matters. Immersion matters. Shadows, lighting, reflections, refractions, occlusion. Raytraced scenes look more real.

Nvidia is charging too much this generation. But ultimately I think they made the right bet. We don't need more pixels.1

As for the Radeon VII, it'll be a 295 W TDP monster and won't overclock much. That's my guess.

1: Well… VR would benefit from 8k. But let's not talk about that.


The 2080 is 210 to 240 W while gaming; 50 W doesn't matter in my books. Says the guy who left his 80 W soldering station on… facepalm


One of the features is Rapid Packed Math, and ROCm support is still not out, I think.
Hopefully an open alternative to RTX will come out. The way Nvidia plays, it will take years for RTX to die, like PhysX and G-Sync.
Nvidia refuses to play Graphics Processing Unit and wants an Nvidia Processing Unit.
Hurry up, Intel, and it will be two open-source-friendly companies vs. Nvidia.


The root of the issue, imo.


It is open: DirectX has DXR, and Vulkan is working on a cross-platform API. Metal supports it too, but not offloaded to hardware yet.

The only game supporting raytracing right now, BF5, uses DirectX's DXR, not Nvidia's proprietary OptiX API.


GN video

4 HBM stacks confirmed

Edit: Would not mind that silver bullet in my system. Looks nice.