Nvidia RTX 20XX Thread

Supposedly nvidia is working on bringing RTX to Vulkan:

https://www.phoronix.com/scan.php?page=news_item&px=NVIDIA-RTX-Vulkan-WIP

Dunno, still interested to see what they’re doing and whether it even qualifies as ‘ray tracing’, other than in the sense that ray casting is basically ray tracing or something.

I mean, it’s one thing to say ray tracing is difficult, but you can look up the Doom 3 demo that was ray traced in real time. And if you eliminate 90% of the workload, it’s suddenly not as bad.

I love Steve’s reaction to Tom’s Hardware saying “There’s a cost… to saving money…”

2 Likes

Raytracing is not new technology; it’s been used by the renderers behind movies and photorealistic images for decades. So we know exactly what it looks like and how much superior it is to rasterization.

The raytracing support for Vulkan is developed by Nvidia. RTX is only what they call their driver. Both the DirectX extensions as well as the Vulkan ones will be available to all vendors. This is not comparable to PhysX at all.

Nvidia may be porting their stuff to Vulkan, but AMD’s is already open source on Vulkan… since March… when NV announced it as a gameworks feature.

Nvidia will be pushing Gameworks/DX with it, guaranteed.

1 Like

Why the speculation? We KNOW they are developing the Vulkan extensions: http://on-demand.gputechconf.com/gtc/2018/presentation/s8521-advanced-graphics-extensions-for-vulkan.pdf

As for RadeonRays:

  • It’s not comparable to RTX. RadeonRays uses the regular shader cores and is thus far slower than RTX. Nvidia’s counterpart to RadeonRays is called OptiX and has been around for years.
  • RadeonRays is not in Vulkan. Whoever wrote the article has no idea what they are talking about. RadeonRays is to Vulkan as Unreal Engine is to DirectX.
  • Funnily enough I have never managed to get it to run on AMD hardware. Works fine on both Intel and Nvidia though. If you’ve ever been on AMD’s ProRender discord you also know how buggy it is.

Sure. But the Vulkan extensions will be the same as those used by AMD. If Nvidia’s technology is not gonna be adopted - as you say - neither will AMD’s.

Haven’t read anything in this topic but here is my 2 cents:

The tech behind it is great. No question. Using AI to fill in missing pixels instead of actually computing all of it is the way to go. I am truly impressed by that and how they are using it.
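
Just to illustrate the idea (a toy sketch, not Nvidia’s actual pipeline - the real thing uses a trained neural network, and the scene function here is made up): render only a fraction of the pixels, then reconstruct the rest.

```python
# Toy sketch of "render less, reconstruct the rest".
# NOT Nvidia's pipeline: real systems use a trained neural network;
# plain pixel duplication stands in for the "AI" here, and scene()
# is a made-up stand-in for an actual renderer.
import numpy as np

def render_quarter_res(scene, h, w):
    # Shade only every other pixel in x and y: 25% of the work.
    ys, xs = np.mgrid[0:h:2, 0:w:2]
    return scene(ys, xs)                      # shape (h//2, w//2)

def reconstruct(low_res):
    # Stand-in for the learned upscaler: duplicate each sample 2x2.
    return np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)

scene = lambda y, x: np.sin(x * 0.05) * np.cos(y * 0.05)
frame = reconstruct(render_quarter_res(scene, 480, 640))
print(frame.shape)   # (480, 640) image from ~25% of the shading work
```

The bet Nvidia is making with the tensor cores is that a network trained on ground-truth frames can fill those gaps far better than this naive duplication.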

And now the “BUT”: nobody will use it, because it tanks fps. And studios that aren’t being sponsored to implement it … simply won’t, because so far nobody has a card to use it.

For the consumer it is PhysX 2.0. Don’t bother … yet.

4 Likes

Exactly why I believe it is PhysX 2 as well. It just isn’t worth the frame rate trade-off for the vast majority of players on current hardware.

It’s a step, and maybe useful for developers to play with. But a killer next-gen feature you just MUST have a 20xx card for? Please…

1 Like

I’ve put my thoughts in a new thread The case for raytracing [Not Tom’s Hardware Edition]

2 Likes

This is just the beginning. Ray tracing is the future for graphics. It’s not like PhysX. PhysX is a proprietary API.

Ray tracing has been around for a long time; this is just the first time there has been anything powerful enough to do it in real time. It may take a few generations of GPUs for it to be totally useful, but everything has to have a starting point.

Well, Nvidia’s ray tracing is just as custom. It is not really ray tracing at all. It’s half-assed ray tracing up to the point in time a frame needs to render, then AI ‘derp learning’ fills in the blanks as best as AI can. All custom, not open for the world to improve on.

So it either dies with Nvidia or it kills the other walled gardens, because there is no open-source sharing like Vulkan… it’s me or die.

3 Likes

I was ray tracing things with pov-ray in 1993. I know what ray tracing is :smiley:

There is a reason rasterization has been used for the better part of computer history.
The main problem with Raytracing in its current form is that the hardware is not fast enough (like @Marten said):

Yes, it may be amazing in 3 generations. Maybe we can have better multi-GPU support and VR that does not look like a tennis racket by then.


Ray tracing is pure compute, right? And AMD excels at computation. They may have the right idea…

True, but one of the reasons GPUs are fast is that they accelerate common operations with special-purpose hardware. Example: fetching the color of a texture requires blending between several texels (bilinear filtering). This is just a few vector operations - something shader cores are very fast at. And yet GPUs include specialized hardware to accelerate the process, because it is so common.
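
To make that concrete, here’s roughly what a single bilinear texture fetch computes (a simplified sketch - real GPUs also handle wrapping, mipmaps and compressed formats, which this ignores):

```python
# What a bilinear texture fetch boils down to: four texel reads and
# three lerps. GPUs hard-wire this because it runs billions of times
# per frame. Edge wrapping, mipmapping etc. are omitted here.
import numpy as np

def bilinear_fetch(texture, u, v):
    # texture: (H, W, 3) float array; u, v: texel-space coordinates.
    x0, y0 = int(u), int(v)
    fx, fy = u - x0, v - y0
    c00 = texture[y0,     x0]
    c10 = texture[y0,     x0 + 1]
    c01 = texture[y0 + 1, x0]
    c11 = texture[y0 + 1, x0 + 1]
    top = c00 * (1 - fx) + c10 * fx       # lerp along x (top row)
    bot = c01 * (1 - fx) + c11 * fx       # lerp along x (bottom row)
    return top * (1 - fy) + bot * fy      # lerp along y

tex = np.random.rand(256, 256, 3).astype(np.float32)
print(bilinear_fetch(tex, 12.25, 40.75))  # blend of 4 neighbouring texels
```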

The speed increase with raytracing is likely much higher. Imagination Technologies, for example, claims their ray tracing hardware only requires 1/100 of the power of general-purpose compute cores:

The company claimed that its ray tracing architecture needs 100 times less power compared to desktop GPUs from AMD or Nvidia, quoting that real-time ray tracing is possible within just 5-10 watts.

(Not sure how that is supposed to work: 100 × the quoted 5-10 watts would be 500-1,000 watts, and no GPU burns that much. But you get the idea :smile:)

1 Like

5 to 10 watts… Photonic transistors and a quarter-watt laser? Maybe?

It’s only a “quantum leap” if it’s used effectively by those who code the games.

Yeah, but it’s 25% larger than the 980 Ti and ‘only’ 100% more money… And yeah, so what if margins are a bit lower on larger dies - don’t you think they’d make more if they sold the die at their median price per sq mm for a big die, i.e. about $800?

If all these components need to be bigger to fit the tensor cores and add ray-tracing support to the CUDA cores, then just drop the smaller dies and price points and sell only bigger dies, at the same or slightly lower price per sq mm. Make smaller margins, but because they’d sell more GPUs they could make more overall.

They can’t be making much on the $200-250 traditional x60 parts…

Yeah, not so much, eh? I don’t think it’s a quantum leap if every game level is covered in shiny puddles. I never thought of puddle reflections or bonnet reflections when hearing of ray tracing. I always thought of windows, shadows, lighting, the way an object changes visually based on the lighting, mirrors, that sort of thing.

Take a car racing game, for instance: you could have very realistic dirt/mud and rain on the windscreen - think of a film of dirt being ray traced through. You could probably have very good-looking fog/smoke in a battlefield game, very good muzzle flashes and such… But never did I think they would cover the entire ground surface in water…

And trees - omg, imagine the leaves in an immersive game like Crysis. But c’mon, reflections in puddles? I think when ray tracing actually takes off, devs will look at ways to implement far better effects at a lower cost (like simple dynamic lighting). But of course this would require a standardized API, sort of like we have now across consoles and PC…

Even if Nvidia controls 95% of the PC GPU market, game devs aren’t going to want to do the lighting twice - once using GameWorks and a second time using the console APIs. So GameWorks will just be tacked-on dog poop.

1 Like

The problem with big dies isn’t that you pay 25% more for 25% more silicon.

it’s a little more complicated than that.

25% bigger increases the chance that any given die on the wafer catches a defect (and fewer dies fit per wafer).

So not only are you paying for more silicon, you’re also paying for a higher failure rate on the chips in that wafer that didn’t work.
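
Back-of-the-envelope version of that argument, using a simple Poisson yield model (the defect density and die areas below are illustrative guesses, not foundry numbers):

```python
# A die only works if zero defects land on it, so yield falls
# exponentially with area (Poisson model). Numbers are made up.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return wafer_area / die_area_mm2          # ignores edge losses

def yield_rate(die_area_mm2, defects_per_mm2=0.001):
    return math.exp(-die_area_mm2 * defects_per_mm2)

for area in (471, 589):   # a big die vs. one ~25% bigger (guesses)
    good = dies_per_wafer(area) * yield_rate(area)
    print(f"{area} mm^2: ~{good:.0f} good dies per wafer "
          f"(yield {yield_rate(area):.0%})")
```

With those made-up numbers, 25% more area gives you roughly 30% fewer good dies per wafer, so the cost per working chip rises by noticeably more than 25%.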

Plus there’s nothing competing at the moment, so…

I hate nvidia as much as anyone, but the price increase is “somewhat” justified by the die size.

No, it’s a quantum leap precisely when pretty much nothing visibly changes. Please be mindful of the fact that a ‘quantum leap’ is the smallest measurable state change possible. Marketing departments ate it up, but it’s the most ironic phrase ever in this oligopolistic world, in which most innovation is about branding.

1 Like

Don’t have much context for this ‘leak’, treat it as speculative for now.

2 Likes