RTX on GTX cards

Nvidia overestimated their own market share. They thought that since AMD isn’t competing with them in the high-end market, they’d not only sell well but also migrate previous high-end customers (Pascal) to their new RTX line of cards. This April driver update is basically their way of luring previous-gen card owners toward RTX.

We’ve already started seeing efforts from developers to implement raytracing without relying on dedicated proprietary hardware. This demo runs on a Vega 56.

This definitely does …


What I mean is that there is no need for the dedicated hardware. Crytek, and now foolishly nVidia themselves, have proved as much. It is not going to be fast (new stuff never is), but now it is known that this is just software and does not need the RTX hardware at all.

I am interested to see how they perform now, and whether anyone will compare nVidia’s DXR implementation against Crytek’s native one. If the Crytek version runs at a reasonable speed, or in any way faster than nVidia’s solution on 10-series cards, they will have shot themselves in the foot by revealing that their solution was just software designed to sell overpriced cards, when it could have been software all along.


Hold on there. The scene in Crytek’s demo was almost entirely static. You could do most of those reflections with old-school environment mapping. We simply don’t know what shortcuts they’ve taken, so we can’t draw any conclusions.

It sure looked impressive, but from a technical standpoint they didn’t give us much to work with.

What’s the problem with dedicated hardware anyway? You also don’t need dedicated texture units, but they provide an easy way to speed up rendering. That’s a good thing. Same goes for raytracing.


Yup… Just like PhysX… You can use dedicated hardware or you can just use Havok and be perfectly fine…

And Nvidia uses machine learning based denoising because their ray tracing uses so few rays. We are not here to count limitations.

Dedicated hardware is okay. You just have to keep in mind that software changes, hardware that is in your system does not.
If Nvidia had put out a dedicated RTX card (like the early PhysX cards), this would not be as bad. As it is, any later changes to how raytracing happens in software may render the 20-series worthless in that department.

This is critical though. Here’s a realtime raytracing video from 2016:

Does this mean we could’ve used this technology in games all this time? No, because their implementation has too many limitations. Same goes for Crytek. We need more information.

True, but this is the case for all graphics technology. If the way tessellation happens changes, current hardware is useless in that department. Same for texture filtering, texture compression, etc.

Nothing in particular, just that this time Nvidia themselves have proved that even they don’t need it for their own supposedly tailor-made solution. So why does it exist?

Speed is not a great argument, as even with the dedicated hardware it is not blazing fast by any means.

What I am really waiting for now is the comparison of the 10 and 20 series. How much of a gap is there?

It’s too early to say. The RTX demos were obviously rushed and poorly optimized. They shoehorned ray tracing into engines that were never designed for this. This was a stupid decision by nvidia imo.

Crytek on the other hand has shown very little and is not doing path tracing. We simply don’t have enough information to go on right now.

Everybody’s getting upset without having any data.


Neither is nVidia as far as I know; they are just doing fancy reflections.

Edit: the like in this case means that I agree with everything else you have said.

Sharp reflections don’t cause noise. At least some of the demos were path tracing.

I don’t know enough to counter that, but here’s a thought.

If they are only using so few rays, all from a point source, there will be gaps as they spread out moving away. The further the reflection, the larger the gaps. This would need denoising for sure, but it still would not be path tracing.
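To put numbers on the gap idea: under a toy assumption (rays leaving a single point at a fixed angular spacing, which is a hypothetical setup, not anyone’s actual implementation), the linear gap between neighboring rays grows with the distance they travel:

```python
import math

# Toy geometry sketch (hypothetical, not any vendor's implementation):
# rays leave a single point at fixed angular spacing, so the linear gap
# between neighboring rays grows with the distance travelled.

def gap_between_rays(num_rays, fov_degrees, distance):
    """Approximate spacing between adjacent rays after 'distance' units."""
    angular_spacing = math.radians(fov_degrees) / num_rays
    # Arc-length approximation: gap ~ distance * angle for small angles.
    return distance * angular_spacing

near = gap_between_rays(num_rays=1000, fov_degrees=90, distance=1.0)
far = gap_between_rays(num_rays=1000, fov_degrees=90, distance=10.0)
assert far > near  # farther surfaces see sparser coverage
```

So ten times the distance means ten times the gap between samples, which is exactly the kind of sparse coverage a denoiser would have to paper over.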

I have not looked at everything they said about it during launch, but I never heard them say “path tracing” anywhere, which I am sure they would have in order to further one-up everyone else.


Your logic is right, but the rays are shot from the camera, not the light sources. So there’s a sample for each pixel and thus no noise.

Path tracing is technically also a kind of raytracing so people often conflate the terms. What matters is whether they simulate more than just one specular bounce, because those are easy to do (and fake).
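A minimal sketch of the one-sample-per-pixel point, using a hypothetical pinhole camera in plain Python (a toy, not any engine’s actual code): each pixel maps to exactly one deterministic primary ray, so repeated renders give identical results.

```python
# Hypothetical pinhole-camera sketch: each pixel maps to exactly one
# primary ray with no random sampling, so rendering the same pixel twice
# always gives the same result -- structured artifacts, but no noise.

def primary_ray(px, py, width, height):
    """Deterministic ray direction through the center of pixel (px, py)."""
    x = (2.0 * (px + 0.5) / width) - 1.0   # NDC x in [-1, 1]
    y = 1.0 - (2.0 * (py + 0.5) / height)  # NDC y in [-1, 1]
    dx, dy, dz = x, y, -1.0                # camera looks down -z
    n = (dx * dx + dy * dy + dz * dz) ** 0.5
    return (dx / n, dy / n, dz / n)

def shade(px, py, width=640, height=480):
    """Toy shading: a deterministic function of the single camera ray."""
    d = primary_ray(px, py, width, height)
    return tuple(abs(c) for c in d)

# Same pixel in, same color out -- no randomness means no noise.
assert shade(100, 200) == shade(100, 200)
```

Noise only appears once you start sampling bounces or lights randomly, which is the path-tracing territory discussed above.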


Makes sense. I was thinking of the eye as the point, but yeah, the camera would be a field rather than a point.

The problem with path tracing is that every bounce of every ray requires new calculation, making it nearly impossible to do in real time (which is why Nvidia does the denoising).
Ray tracing is much “cheaper” compute-wise, as you do it backwards: from the camera through a raster plane, bounce off an object, and see if it can reach a light source from there.
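That backwards trace can be sketched roughly like this (a hypothetical scene with one blocking sphere and one point light; a toy illustration of the shadow-ray step, not a real renderer):

```python
import math

# Toy backwards trace (hypothetical scene, not a real renderer):
# camera ray -> hit point -> shadow ray toward the light.
# If the shadow ray is blocked before reaching the light, the point is in shadow.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    """Distance along a normalized ray to the nearest hit, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # direction assumed normalized, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def lit(point, light_pos, blocker_center, blocker_radius):
    """Shadow ray: can 'point' see the light past the blocker sphere?"""
    to_light = sub(light_pos, point)
    dist = math.sqrt(dot(to_light, to_light))
    d = tuple(c / dist for c in to_light)
    t = hit_sphere(point, d, blocker_center, blocker_radius)
    return t is None or t > dist  # shadowed only if hit before the light

# A sphere directly between the point and the light casts a shadow.
assert not lit((0, 0, 0), (0, 0, 10), (0, 0, 5), 1.0)
# Move the blocker aside and the light is visible again.
assert lit((0, 0, 0), (0, 0, 10), (5, 0, 5), 1.0)
```

The key saving is that only rays that actually reach the camera get computed, instead of simulating every photon leaving the light.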

But what if a pixel traced from the camera bounces into the shadow of an object? That is a black pixel and could be part of noise.

Pretty much :+1:

I’ll just add that noise is very common in computer graphics and there are many ways to deal with it. This is exactly why engines need more time.

There is no randomness in raytracing, so all neighboring pixels would also hit the same shadow. You would see a weird reflection, but no noise.

Navi is still GCN. It is a cheaper Vega replacement. Don’t expect more performance than the VII, which barely competes with the 2080.

It won’t be until post-GCN. Sometime in 2020–2021 AMD may release something sort of competitive.

Given the relative excess of compute capability on Vega (vs. ROPs), it would not surprise me to see some form of ray tracing work a lot better on it than on non-RTX NVIDIA hardware.

I imagine this is what people said about the Model-T, the semi-automatic rifle, and birth control.

Thank God companies like Ford, Colt, and Nvidia push through the criticism. God bless them all.


When you bought a Model-T you simply added gas and used it. When you bought a semi-auto rifle you bought ammo and used it. When you bought birth control you used it. When you bought an RTX card that “just worked” you waited. You waited for Windows support. You waited for NVidia graphics drivers to support every feature, and you waited for games to support the feature. Now NVidia is still working on drivers, Windows has support, but you still wait for titles to support it. If titles do not specifically code for it, it doesn’t happen. RTX will not revolutionize gaming like the Model-T revolutionized vehicles, the semi-auto revolutionized killing, and birth control revolutionized sex.


You’re oversimplifying those products. There was an entire sales, training, and use-case implementation effort behind all of them. I would argue RTX is infinitely more complex than a semi-automatic rifle. You expect it to just work, and it does: when I plug it in, I can turn on my computer. Nvidia are not game and media developers. If most in the industry are lazy and have a poor attitude about it, then yeah, raytracing might not “just work”.

You’re confusing sales and marketing with initiative. Every tech manufacturer has shortcomings at release. But when you declare RTX a failure, or imply that something has potential but no one wants to use it, you all sound like chariot drivers to me.
