Nvidia RTX 20XX Thread

In terms of SSR, you actually compile a vertex shader to do it (look up the code).
Here's a small example of an SSR shader: https://pastebin.com/EkHhfQA1

There are plenty of hacks you can use to optimize reflection shaders, since you want to save on performance. So you implement a lightmap with specularity for objects, the skybox, and anything further away or more complicated; and you pretty much compute only particle effects and changing models in real time while everything else stays in a 'ready' state.

A lightmap in itself doesn’t need to be a static image like in its first implementation in Quake (it could be a 3D scene or object, with the viewpoint calculated by the vertex shader and used for SSR). If you wanted to play around with technicalities, you could say SSR, ambient occlusion, global illumination and many more are just lightmaps or vertex shaders hehe :wink:

Yes, of course, because there’s no other way to render in OpenGL. But the geometry is just a flat rectangle, and the vertex position is thus only the 2D coordinate on the screen. It’s really just calling the shader on every pixel in the framebuffer.

Notice how the code you posted gets the position from a texture. The vertex position is meaningless:

uniform sampler2D gPosition;
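
To make that concrete, here's a rough sketch of what a deferred SSR pass tends to look like (not the pastebin's actual code): the vertex stage only emits a full-screen quad, and all of the real work happens per pixel by reading view-space positions back out of gPosition. Everything other than gPosition (gNormal, sceneColor, projection, the march step and count) is an illustrative assumption.

#version 330 core
// Fragment shader run over a full-screen quad; the vertex stage contributed
// nothing but 2D screen positions and this interpolated UV.
in vec2 uv;
out vec4 fragColor;

uniform sampler2D gPosition;   // view-space position per pixel (from the G-buffer)
uniform sampler2D gNormal;     // view-space normal per pixel (assumed)
uniform sampler2D sceneColor;  // lit scene color to reflect from (assumed)
uniform mat4 projection;       // view space -> clip space (assumed)

void main() {
    vec3 pos = texture(gPosition, uv).xyz;
    vec3 nrm = normalize(texture(gNormal, uv).xyz);
    vec3 viewDir = normalize(pos);             // camera sits at the origin in view space
    vec3 rayDir = reflect(viewDir, nrm);

    // Crude fixed-step march in view space; real implementations march in screen
    // space with hierarchical depth, thickness tests, edge fading and so on.
    vec3 p = pos;
    for (int i = 0; i < 32; ++i) {
        p += rayDir * 0.1;
        vec4 clip = projection * vec4(p, 1.0);
        vec2 hitUv = clip.xy / clip.w * 0.5 + 0.5;
        if (any(lessThan(hitUv, vec2(0.0))) || any(greaterThan(hitUv, vec2(1.0))))
            break;                             // ray left the screen: nothing to reflect
        float sceneZ = texture(gPosition, hitUv).z;
        if (p.z < sceneZ) {                    // marched point went behind the visible surface
            fragColor = texture(sceneColor, hitUv);
            return;
        }
    }
    fragColor = texture(sceneColor, uv);       // fallback: keep the original color
}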

Well, kind of. Reflections are in part done with cube maps, environment mapping, reflection probes, or whatever you want to call them. Light maps don’t work, however, because they look the same no matter which direction you look from. That makes them unsuited for reflections, which obviously depend on the camera position.
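
As a minimal sketch of why a cube map can do what a light map can't: the probe lookup direction is derived from the view direction, so the result changes as the camera moves (all names here are illustrative).

#version 330 core
in vec3 worldPos;                    // interpolated world-space position
in vec3 worldNormal;                 // interpolated world-space normal
out vec4 fragColor;

uniform samplerCube environmentMap;  // the reflection probe / cube map (assumed name)
uniform vec3 cameraPos;              // world-space camera position (assumed name)

void main() {
    vec3 viewDir = normalize(worldPos - cameraPos);
    vec3 r = reflect(viewDir, normalize(worldNormal));
    // The lookup direction depends on where the camera is, which is exactly the
    // view dependence a baked light map cannot encode.
    fragColor = texture(environmentMap, r);
}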

You could calculate light maps in real time, but that defeats the whole purpose of using them in the first place. You’re better off using something like Spherical Harmonics / Spherical Gaussians and updating those.
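
The appeal of SH for this is that the lighting collapses into a handful of coefficients you can cheaply update every frame and evaluate per pixel. A very stripped-down sketch of a 2-band (constant plus linear) evaluation, assuming the basis constants are pre-folded into the coefficients; shCoeffs and shIrradiance are made-up names:

// 4 RGB coefficients: band 0 (constant) plus band 1 (linear terms in y, z, x),
// updated by the application whenever the lighting changes.
uniform vec3 shCoeffs[4];

vec3 shIrradiance(vec3 n) {
    // n is the unit surface normal; basis scaling is assumed baked into shCoeffs.
    return shCoeffs[0]
         + shCoeffs[1] * n.y
         + shCoeffs[2] * n.z
         + shCoeffs[3] * n.x;
}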

Object complexity doesn’t matter either, but we’re getting away from the original topic :smile:

ikr

Just wanted to let you know you can still do lightmaps in real time, but without taking all objects into account (that’s how it’s done today, in most cases with SSR or reflective objects). I’m at lunch at the moment, so maybe after work I can use some game to show it as an example; maybe the first Far Cry with its env-bump-mapped water would be the best example of that.

Looking forward to it. Let’s start a new thread though :rofl:

Taking guesses here:
DisplayPort and HDMI can both be carried over USB-C. Having a single connector to plug in your VR headset would be pretty neat.
So probably that?

What do you guys think this means for the 1000 series cards? Any clue as to when and if those prices will come down further?

I agree with many above: these prices are too high for me to be interested. Don’t really care about bleeding-edge stuff.

Nvidia & AIB’s still have tons of 1000 series chips to sell. And they’ll trickle their way through the market.

Expect some price cuts and people flocking to get those 1080’s they previously couldn’t afford.

Historically, previous-gen Nvidia prices do not drop much. People thought they would with both the 700 and 900 series, but it never happened.

Bleh. Time for a new hobby. This shit is getting stale

USB-C is the new DisplayPort connector. It’s just USB; Type-C carries USB, Thunderbolt AND DISPLAYPORT.

I may have liked PhysX, but sorry, I am not going to pay more for something that gives a more cinematic experience. In fact, in all honesty, until I bought a GTX 1080 Ti I never bought an expensive card, and thus did not pay more for proprietary stuff. Particle effects matter to me. A game looking more cinematic does not, and definitely not at the prices they are stating. Also, they announced prices and then changed them a day later. Just sickening.

Exactly.

If they’re already cheaper than the new cards (they are) then why would they drop in price?

Also, previous generation cards won’t be on sale for very long in the new card market.

HotHardware has a 1-hour interview with Tom Petersen covering RTX. I’m only 20 minutes in, but there are a lot of nuggets on how it will work and plug into games.

I know it is a late reply, but PhysX was a pain in the ass for years. If RTX hangs on that long, this will suck.

I can see ray tracing becoming the preferred rendering method sometime in the future. But judging by Nvidia’s own performance results (30 fps), I don’t think the hardware is there yet to support the technology.

In my mind, that means game devs are going to have a choice: program for a technology that makes their games look better but perform like shit, or stick with the tried-and-true method that requires no extra effort and makes their games perform well.

I think ray tracing will be very, very niche for many years yet.

At the end of the day, what is the most popular game type for a 3d card user?

First Person Shooter

And what do people live and die by (literally in game) with FPS?

Frame rate

No matter how fast the thing is, people will take 144hz with no ray tracing over 30-60 fps with ray tracing every single day of the week for these games.

Are you really going to notice the reflection in that enemy’s eyes during competitive FPS? Fuck no.

Also - at the frame rate these games are played with you could likely drop the resolution to 480p in the screen area 4 inches away from the crosshair and the user wouldn’t care… early PC demo-scene stuff taught me that years back. If you run fast enough you can drop details significantly and the movement is too fast for people to notice or care.

In fact, i’m SURE that Gran Turismo 6 actually DOES THIS on the PS3. It cheats when rendering background stuff that is “out of focus” on replays, etc.

Will the average gamer pay to get ray traced 1080p60 (with dips below that) when they could spend 1/4 the money and get > 1080p60 (like, 80-90 fps) with rasterization only, that looks almost as good?

I don’t think so. Not until there’s some better use case than the occasional reflection in a scene (that can usually be done well enough to be “believable” at high frame rate with rasterization hacks anyway).

I definitely agree that ray tracing is the holy grail, but it’s going to be a pretty tough sell until the hardware is there to do much better than this at much lower price.

There are many use cases for ray tracing other than rendering pretty effects. And because they don’t have to be done for every pixel, they’ll be plenty fast.

They also live and die by realistic sound. How do the best sound engines work? Ray tracing.

Very popular FPS with equally obvious reflection problems that could be fixed by ray tracing:

The average gamer doesn’t play on a 144Hz screen either and doesn’t care about FPS unless they drop below 30.


You folks have to realize that there is no “rasterization versus ray tracing” going on. The industry doesn’t have to “switch”. It’s “rasterization plus raytracing”. Ray tracing can be added to existing engines and their developers will find ways to integrate it without killing performance.

I’m not saying it is all or nothing. My “either/or” was for raytracing plus rasterisation vs. vanilla rasterisation as demonstrated with the significant frame rate hit by Nvidia.

i’m saying that getting people to pay for fixes to non-game-breaking and barely noticeable rendering issues is going to be a hard sell. Same as 4K gaming being a hard sell for most people currently.

Faced with say 4k60 or 1440p144 it’s a no brainer for most.

And if it doesn’t sell in volume, no game is going to use it for any serious effect.

We’re WAY WAY off it being cheap enough to get any serious adoption or use is all i’m saying.

No, rasterization isn’t perfect and has plenty of issues. But unless you’re actively looking for them they aren’t normally noticeable in actual gameplay. And even if they are, losing half or more of your frame-rate to fix them is an easy “NOPE” trade off - at least until the hardware is MUCH faster than this.

My “average gamer” comment should perhaps be “average mid-high end GPU buyer”.

The “average gamer” is a console peasant or 1060 buyer and the low end 20 series cards are definitely not going to be capable of this stuff, so for the purposes of discussion they are irrelevant.

Stuff like raycasting for sound has been possible for years already at this point. 20 series not needed for that.

I agree that the demos were very bad. This is just another reason not to use them as a performance indicator, though.

I think they are very obvious in the screenshot above, in what happens to be the biggest game of all time.

You can’t use Nvidia’s demos as a performance indicator. These are extremely early and unoptimized pieces of software. An unoptimized rasterizer wouldn’t be any faster. These implementations are obviously extremely basic:

[…] given the short amount of time they had to work with the new GeForce RTX cards (roughly two weeks).

For example, currently the resolution for each ray is the same as the chosen internal resolution.

While triangle count apparently doesn’t truly impact RT performance in Battlefield V, the amount of singular ‘instances’ drawn on the screen (for example, doors, windows et cetera) does have a sizable effect. DICE is investigating the possibility to merge the instances into the same structure, which could lead to an almost 30% increase in raytracing performance.

(https://wccftech.com/ray-traced-battlefield-v-runs-sub-30-4k/)

… never mind that you’d only have to trace half as many rays in engines with temporal anti-aliasing.
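
As a rough sketch of that “half the rays” idea: trace only a checkerboard subset of pixels each frame, flip the pattern every frame, and let the TAA history / reprojection cover the untraced half. frameIndex and the function name are illustrative, not any engine’s actual API:

uniform int frameIndex;   // incremented once per frame by the application (assumed)

// True for half the pixels on even frames and the other half on odd frames,
// so a full-resolution result accumulates over two frames of temporal reuse.
bool traceThisFrame(ivec2 pixel) {
    return ((pixel.x + pixel.y + frameIndex) & 1) == 0;
}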

You are comparing a quick and dirty prototype to engines that have been optimized for years.


… causing a significant burden on the CPU/GPU, which caused sound engines to trace fewer rays than they should for best results. But that’s not the point.

Physics simulations can be sped up by raytracing. Triangle counts can be increased, because ray tracing is faster than rasterization with a lot of triangles. Lighting can be updated dynamically, removing the ugly light bleed from prebaked light maps. Popping LOD transitions can be reduced or even removed. Bots can use raycasting to determine whether an enemy is visible to them or not.
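
For the bot line-of-sight point, the core operation is just a ray cast from the bot’s eye toward the target, checked against blocking geometry closer than the target. A loose sketch with a single axis-aligned box standing in for a real scene query (all names illustrative; a real engine would query an acceleration structure over the whole scene):

// Slab test: returns the distance to the box along the ray, or -1.0 on a miss.
float rayAABB(vec3 origin, vec3 dir, vec3 boxMin, vec3 boxMax) {
    vec3 invDir = 1.0 / dir;
    vec3 t0 = (boxMin - origin) * invDir;
    vec3 t1 = (boxMax - origin) * invDir;
    vec3 tMin = min(t0, t1);
    vec3 tMax = max(t0, t1);
    float tNear = max(max(tMin.x, tMin.y), tMin.z);
    float tFar  = min(min(tMax.x, tMax.y), tMax.z);
    return (tNear <= tFar && tFar >= 0.0) ? max(tNear, 0.0) : -1.0;
}

bool canSee(vec3 botEye, vec3 enemyPos, vec3 wallMin, vec3 wallMax) {
    vec3 toEnemy = enemyPos - botEye;
    float enemyDist = length(toEnemy);
    float hit = rayAABB(botEye, toEnemy / enemyDist, wallMin, wallMax);
    // Visible unless the blocker is hit somewhere between the eye and the enemy.
    return hit < 0.0 || hit > enemyDist;
}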


Don’t let a few bad demos bias your view of what raytracing can do. Nvidia needed something visual and they needed it fast. Their demos are very badly made IMO.

See, maybe i’m just older and more jaded, but given Nvidia was using this stuff as a showcase for the new card capabilities, one would hope that they had their shit together.

There was no pressure for them to get 20xx out the door with an oversupply of 10 series cards in the pipe.

Thus, i suspect this isn’t some half-assed worst-case performance scenario; i’d suggest it is representative of what we will see, performance-wise.

Otherwise, why would they release?

In the absence of any other information, i would take the Nvidia demos as BEST CASE and most definitely not WORST case for performance.

The whole 20 series release has been full of misleading graphs, misleading performance claims, backtracks and other shenanigans.

I agree ray tracing is the future, but i think the 20xx implementation is half baked.

Wait for the 7nm GPUs next year, don’t buy into this v0.9 stuff at massive expense. Definitely not before there’s a use case for it.