Nvidia RTX 20XX Thread

Yeah, so basically the RTX 2080 is MSRP $800, I believe? So essentially what they did was take GTX 1080 Ti performance, bump the price from $650 to $800, for about a 6% performance boost. You also gain the RT and Tensor cores and whatnot, but it’s not exactly a value bargain.
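
Putting rough numbers on that (using the same $650 / $800 / ~6% figures from above):

```python
# Back-of-the-envelope value check using the figures above:
# 1080 Ti at ~$650 MSRP vs. RTX 2080 at ~$800 MSRP, ~6% faster.
old_price, new_price = 650, 800
perf_gain = 0.06

price_increase = new_price / old_price - 1                 # ~23% more money
perf_per_dollar = (1 + perf_gain) * old_price / new_price  # ~0.86x the perf/$

print(f"price up {price_increase:.0%}, perf per dollar {perf_per_dollar:.2f}x")
```

So you pay roughly 23% more for 6% more performance; performance per dollar actually goes down.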

2 Likes

Time Spy runs at 1440p, right?

Edit: Yes

Source: https://s3.amazonaws.com/download-aws.futuremark.com/3dmark-technical-guide.pdf

Here are the direct results

Actually, now that I look at it, it looks a bit weird, but what the hell.

1 Like

Seems legit; I am about 90% certain the benchmark originated in Vietnam.

Seems it really takes a hit.

1 Like

CINEMATIC is the word you are looking for

1 Like

What I am looking forward to knowing is how the RTX 2080 handles 1440p ultra with ray tracing in Shadow of the Tomb Raider. I lost interest in the Battlefield franchise starting with BF1. Battlefield 4 was the last one I had interest in, and even then it paled in comparison to BC2.

That was peak battlefield, downhill from there.

On topic: I think the raytracing demos I’ve seen from BFV look amazing.
Hopefully this RTX launch will help make ray tracing a thing.

If you listen to the talking, they mention that the technologies demonstrated were actually targeting the card generation prior to RTX; they hadn’t even implemented RT-core optimizations at the time of that demonstration.

…unless I was totally hearing it wrong :smiley:

HotHardware:
“What can gamers who own a 1080 expect from the 2080 in current games?”

Tom Petersen:
“We could’ve done a better job on this during the public announcement. Turing is a beast. It will significantly improve the experience on old games and rock it for new technology. We shared some data that shows, for a bunch of games, you’ll see performance somewhere between 35 to 45% better over the prior generation. In most cases, if you are at high resolution and not CPU limited, Turing is going to crush it!”

In two weeks, when the embargo finally lifts, we’ll see if that is actually true. I hope so, but…

3 Likes

It does have faster GDDR6 memory, so that will help at higher resolutions. I also want to see the real reviews.

Not a gamer, so I don’t give a monkey’s, but I know bullshit when I hear it, or even when it’s not there to hear.

Not sure I understand the part where Radeon Rays is not comparable to RTX; those two are completely different and accomplish different things. The Radeon Rays SDK is for rendering the whole scene, sequentially applying the draw buffer as light paths are computed, while RTX is closed tech, so it’s hard to tell what it’s doing. But it looks like raytracing is not applied to all objects, shadows, reflections, and refractions (judging from the demos nvidia has shown).

I’m not even sure how they came up with their Gigarays/s shtick, as that depends highly on the scene, unless, as I mentioned, it’s limited to certain lights, objects, and shadows only, not the whole scene.

// I have a feeling they also use their ‘AI’ software for picture reconstruction to merge the rasterized render with the raytraced rays, so it looks seamless (and most likely it’s using frustum tracing).

1 Like

All Radeon Rays does is calculate ray/geometry intersections. The programmer sets the ray’s starting position, direction, length, etc.; Radeon Rays then calculates the position where each ray intersects a mesh. It does not, however, do any rendering. The programmer can use it to simulate light, sound, physics, occlusion or whatever else they want. RTX is similar. From what I could glean from the planned Vulkan extensions (http://on-demand.gputechconf.com/gtc/2018/presentation/s8521-advanced-graphics-extensions-for-vulkan.pdf), all RTX does is intersection testing. It is then up to the shader (read: “programmer”) to do with the intersections whatever they want. This makes RTX, Radeon Rays and Embree quite similar. (OptiX provides some rendering functionality as well.)
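
To make “all it does is calculate intersections” concrete, here is the core operation these libraries batch-execute millions of times per frame: a single ray/triangle test (Möller–Trumbore). This is a plain-Python illustration of the contract (ray in, hit distance out), not actual Radeon Rays or RTX API code.

```python
def ray_triangle(orig, direction, v0, v1, v2, eps=1e-8):
    """Möller–Trumbore ray/triangle test.

    Returns the distance t along `direction` to the hit point,
    or None if the ray misses the triangle.
    """
    def sub(a, b):   return [a[i] - b[i] for i in range(3)]
    def dot(a, b):   return sum(a[i] * b[i] for i in range(3))
    def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0]]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:             # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv_det  # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det    # hit distance along the ray
    return t if t > eps else None
```

Everything built on top of this (shadows, reflections, sound occlusion, physics) is up to the caller, which is the point both posts are making.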

Indeed. There is a paper on OptiX though if you’re interested. They are likely very similar in their implementation. My guess is that they’ll even be merged at some point.

http://raytracing-docs.nvidia.com/optix/whitepaper/nvidia_optix_TOG_v29_n4.pdf

Somebody at nvidia said “The bigger the better”.

Yes, they use AI to denoise the raytraced image. “Merging” the rasterized and raytraced parts is likely just done by summing the pixel values.

Here’s nvidia’s video on the denoiser: http://on-demand.gputechconf.com/siggraph/2017/video/sig1754-martin-karl-lefrancois-train-your-own-denoiser.html
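
If that guess is right, the merge step itself is almost trivial; something like this per-pixel additive blend (a sketch of the idea, not nvidia’s actual pipeline):

```python
def composite(raster_px, rt_px):
    """Add the (denoised) raytraced contribution on top of the rasterized
    base color, clamping to the displayable range. Pixels are RGB in 0..1."""
    return tuple(min(a + b, 1.0) for a, b in zip(raster_px, rt_px))
```

For a half-grey base and a bright raytraced highlight, e.g. `composite((0.5, 0.5, 0.5), (0.6, 0.2, 0.0))`, the red channel saturates at 1.0 and the rest is lifted.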

3 Likes

Why does the entire RTX 2000 line-up have USB Type-C? What is the point of a USB port on a GPU?

https://www.anandtech.com/show/13268/custom-geforce-rtx-2080-quick-look

I could only guess VR. /shrug

Yeah, so we pretty much agree. The Radeon Rays code calls event->wait() after it finishes calculating a ray pass, implying it waits for all rays in the first pass to be calculated before the next pass starts. In terms of rendering, I meant the SDK AMD supplied with an already-compiled renderer, which renders the whole scene using raytracing rather than rasterization or a mix.

If someone says raytracing, as an old 3ds Max user I expect the whole scene to be rendered using raytracing rather than certain objects/lights/shadows. (In the BF demo it’s clear they only use raytracing on particle effects and save performance on all reflective surfaces by using vertex shading combined with a lightmap.)

I kinda laughed hard when they showed the player’s model being mirrored in the window and stated it’s raytraced, since the light reflected on the gun model wasn’t shown; only the external ‘server’ model was shown instead. So the render was likely just a vertex lightmap, because if a raytraced render were used, the rays would reflect the player’s model from the scene’s point of view. (As for the reflective water on the ground, Black Desert on ultra graphics does something similar, and it’s still just vertex shading.)

This is a good example.

1 Like

Looks like screen space reflections to me. The game walks along the rendered frame pixel by pixel until it finds an intersection, then just copies that pixel’s color. While this technically constitutes ray marching, it’s very different from what RTX does. Screen space reflections also obviously cannot reflect anything that’s not on the screen, while true raytracing can reflect everything.
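
A toy version of that walk, collapsed to one dimension: march the reflected ray across a depth buffer until it passes behind the stored depth, then reuse that pixel. (Illustrative only; real SSR marches in 2-D screen space with perspective-correct depth.)

```python
def ssr_march(depth_buf, start_x, start_z, step_x, step_z, max_steps=64):
    """Walk the reflected ray across a 1-D depth buffer.

    Returns the index of the pixel whose color would be copied, or None
    if the ray leaves the screen or never hits anything -- the classic
    SSR failure case: off-screen content simply cannot be reflected.
    """
    x, z = float(start_x), float(start_z)
    for _ in range(max_steps):
        x += step_x
        z += step_z
        col = int(x)
        if col < 0 or col >= len(depth_buf):
            return None               # ray left the screen
        if z >= depth_buf[col]:       # ray went behind recorded geometry
            return col                # -> reuse this pixel's color
    return None
```

For example, `ssr_march([10, 10, 10, 4, 10], 0, 0, 1, 2)` finds the near surface at index 3, while a ray that never dips behind anything walks off the screen and returns None, which is why SSR reflections visibly cut off at screen edges.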

I’ve written a bit about this in my The case for raytracing [Not Tom’s Hardware Edition] thread.

1 Like

I agree, and I don’t argue with that (SSR is done in the vertex shader using projection * model-view matrix * vertex position; you can add a light there, diffuse or a lightmap, and obviously filters and other nice effects that came in with unified shaders for ambient mapping).

It’s just my impression of the demo, saying “look at all the light being reflected and calculated…”, and at times they stopped to tell everyone that this is raytraced while it obviously isn’t (as in another frame with RTX off, they walk around and the effect is still present).

1 Like

I’m rewatching the demo right now, as I’m not sure what exactly you are referring to. It seems what confuses you is that nvidia only calculates a single bounce of light. Keep in mind that the photorealistic offline renderers use path tracing specifically, not just ray tracing. Every path tracer is a ray tracer, but not every ray tracer is a path tracer.
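
The difference, as a toy sketch with a stubbed scene (hit() is a placeholder of my own, not any real API): a Whitted-style ray tracer follows one deterministic reflection and stops, while a path tracer keeps extending the same path with random bounces and averages many such paths.

```python
import random

def hit(ray):
    # Stubbed intersection: every ray "hits" a surface that emits 1.0
    # and reflects half the incoming light. A real renderer would do
    # ray/geometry intersection tests here.
    return 1.0, 0.5  # (emitted light, reflectance)

def whitted_trace(ray):
    # Classic ray tracing: shoot the primary ray, add one deterministic
    # reflection ray, stop -- roughly what the single-bounce demo does.
    emitted, refl = hit(ray)
    bounce_emitted, _ = hit(ray)      # the single reflection ray
    return emitted + refl * bounce_emitted

def path_trace(ray, depth=0, max_depth=5):
    # Path tracing: keep following ONE randomly chosen bounce direction
    # per hit until the path terminates; photoreal renderers average
    # thousands of such paths per pixel.
    emitted, refl = hit(ray)
    if depth == max_depth:
        return emitted
    next_ray = random.random()        # stand-in for a hemisphere sample
    return emitted + refl * path_trace(next_ray, depth + 1, max_depth)
```

With this stub, whitted_trace returns 1.5, while path_trace sums a geometric series approaching 2.0; that gap is exactly the indirect light that single-bounce raytracing leaves out.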


Unrelated to the raytracing, I believe you are conflating a couple of things. SSR is purely a post-processing effect, so it has access to neither the vertex position nor the model-view matrix. Lightmaps don’t work for reflections either. The reflections are likely all SSR + environment mapping as a fallback.

1 Like