RTX Ray tracing

25% larger die than the 980 Ti, 2x the price, one generation between the two cards, i.e. actually two generations later… GG Nvidia fanboys, this is what life is like when nobody buys AMD's stuff while it's competitive and AMD stops investing in new architectures… and then NVIDIA finally realizes they can charge whatever they want.


Well… I mean, let's give them the benefit of the doubt. Maybe Turing is actually really good for AI and all the other buzzwords, and actually really good for gaming too.


From what I understand, RTX raytracing won't be like Nvidia GameWorks. Microsoft has an update to the DirectX API with hooks for real-time raytracing already baked into Windows for games. It remains to be seen whether Nvidia still does something proprietary.

Any word on whether this stuff helps pro users? Did Nvidia announce a Quadro RTX? Can a mostly-CPU raytracing program like Cinebench use RTX?


I haven’t been following the development of RTX at all. But I thought Nvidia was using some sort of voxel-octree approximation of ray tracing? I’m sure these cards will be great for pre-rendering scenes that feature ray tracing, and possibly even for real-time cutscenes. But can it handle in-game rendering?

Correct. Raytracing is coming to DirectX and “RTX” is merely what nvidia calls their driver implementation. AMD can just as well plug theirs (“Radeon Rays”) into the DirectX driver to support raytracing on their own cards. AMD doesn’t currently have hardware acceleration for ray tracing however.

Raytracing consists of two major parts: the ray traversal/intersection itself and the shader computations. Moving the raytracing to the GPU is pretty easy to do even in a legacy, CPU-only renderer. Just don’t expect them to start running their shaders on the GPU as well.
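The split above can be sketched in plain Python (a toy of my own, not NVIDIA's API or any real renderer): only the intersection stage is what RT hardware accelerates, while the shading stage just consumes its output and could stay wherever the renderer already runs it.

```python
# Toy sketch (hypothetical, not any vendor's API) of the two stages:
# stage 1 (intersection/traversal) is what RT cores accelerate,
# stage 2 (shading) can stay wherever the renderer already runs it.
import math

def intersect_sphere(origin, direction, center, radius):
    """Stage 1: ray/sphere intersection; returns hit distance t or None.
    `direction` is assumed normalized, so the quadratic's a-term is 1."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def shade(origin, direction, t, center, light_dir):
    """Stage 2: shading at the hit point (a simple Lambert term)."""
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = tuple(h - c for h, c in zip(hit, center))
    inv_len = 1.0 / math.sqrt(sum(n * n for n in normal))
    normal = tuple(n * inv_len for n in normal)
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# One ray fired straight down -z at a unit sphere at the origin:
t = intersect_sphere((0.0, 0.0, 5.0), (0.0, 0.0, -1.0), (0.0, 0.0, 0.0), 1.0)
brightness = shade((0.0, 0.0, 5.0), (0.0, 0.0, -1.0), t,
                   (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

Note how nothing in `shade` cares where `t` came from: a renderer could ship rays off to the GPU for stage 1 and keep its existing CPU shading code for stage 2.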

Note that RTX isn’t anything new, though. Nvidia has offered a raytracing SDK for years (“OptiX”), so there’s no real reason why renderers would suddenly start using it just because it has a new name.


You’re thinking of their old VXGI (https://developer.nvidia.com/vxgi). I’m 99% certain RTX uses regular triangles and not octrees.

Absolutely not. All of the demo scenes they’ve shown were either noisy or had only a few effects using raytracing. Expect games to continue using rasterization for rendering and then add a couple of ray-traced effects on top. The most likely candidates are better reflections, ambient occlusion and dynamic indirect lighting.

Oh, OK. I remember Nvidia showing off presentations of VXGI in the past. I thought this was an extension of that.

Yeah, that’s what I figured. The idea of real-time ray tracing for everything is crazy. This is a first-generation implementation of ray tracing in a consumer card, so I expect it to be used sparingly. It is a step forward in real-time rendering, but at the same time it still seems like a marketing point for now.

“For existing games, Turing will deliver 2x the performance and 4K HDR gaming at 60 FPS on even the most demanding titles.” (via NVIDIA)

Benchmarks should be here soon™

https://www.anandtech.com/show/13261/hands-on-with-the-geforce-rtx-2080-ti-realtime-raytracing

NVIDIA GeForce RTX 2080/RTX 2080 Ti Performance Review Embargo Ends 14th September

Oh, these charts look so fishy…
If they had tangible 50% perf improvements to show off, they wouldn’t use that abstract “50% per-core shading improvements” chart with games.

They’d put real numbers on the table, and they’d be shovelling in money from pre-orders if the actual numbers were that good. This looks like yet more marketing wank.


Yep! It’s really strange! I hope for vast improvements for the sake of gaming and GPUs and such, but in the live stream and now… like you said, there are no real black-and-white numbers…

…I wonder if that’s because it takes three weeks after launch for benchmarks to come out…

Anyone else feel the tinfoil creeping up?


Except for the pricing, die sizes and shader core counts… and the process node and the release schedule for 7 nm…

Certainly Nvidia are continuing their push for higher margins, and I’m all for that, all for capitalism and competition, but c’mon… They hold patents which basically make it illegal (patent infringement) to compete, and they used them to drive competition out of the market…

Surely if the RAM makers can’t just make up a price and charge whatever they want, then neither should Nvidia… Nor should they be able to pay other companies to put ‘Nvidia, the only way to play games’ at the front of every game… It’s basically telling people they’ll have a crap experience unless they use an Nvidia card…

Until recently that was frankly not true… In fact, for most of gaming history, whatever you bought, at whatever price point you bought it, you actually got a better-performing card if you bought team red…

Because team red would basically always undercut the other side’s price, even if it meant selling a big die for a small price at low margins.


I think they did and it was like 20 grand.

Do we know whether RTX has native support for motion blur? Real-time raytracing is nice and all, but for offline renderers like Blender to benefit from the raytracing hardware, there has to be support for motion blur.

Embree, OptiX and Radeon Rays all have this functionality built in, but I haven’t seen anything about RTX so far.
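To illustrate why this has to live inside the intersection stage (a toy of my own, not the RTX/DXR or Embree API): each ray carries a shutter-time sample, and the intersector has to interpolate the geometry to that time before testing the hit, so motion blur support can't just be bolted on outside the traversal code.

```python
# Toy sketch (hypothetical, not any real raytracing API): motion blur by
# per-ray time samples. The sphere's center is interpolated to the ray's
# shutter time *inside* the hit test, which is why the intersection
# stage itself needs motion blur support.
import random

def lerp(a, b, t):
    """Linearly interpolate between two points a and b at parameter t."""
    return tuple(x + t * (y - x) for x, y in zip(a, b))

def hit_moving_sphere(origin, direction, c0, c1, radius, time):
    """Boolean hit test against a sphere whose center moves from c0 to c1
    over the shutter interval; the center is interpolated to `time` first."""
    center = lerp(c0, c1, time)
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    return b * b - 4.0 * c >= 0.0  # direction assumed normalized (a == 1)

# A unit sphere sweeping left to right across the ray's path during the
# shutter interval [0, 1]; sampling many shutter times produces the blur.
random.seed(0)
c0, c1 = (-2.0, 0.0, 0.0), (2.0, 0.0, 0.0)
hits = [hit_moving_sphere((0.0, 0.0, 5.0), (0.0, 0.0, -1.0),
                          c0, c1, 1.0, random.random())
        for _ in range(1000)]
coverage = sum(hits) / len(hits)  # fraction of the shutter the sphere covers
```

Here the sphere only crosses the ray for the middle half of the shutter, so `coverage` comes out near 0.5: exactly the kind of partial, smeared visibility motion blur produces.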