Nvidia RTX 20XX Thread

Pretty much all renderers are written in C++, with C# only being used for scripting. If what you are saying were true, all games would be CPU bound.


That is interesting. And from my understanding of how the render process will go, it will do the ray tracing and the rasterizing sequentially (I forget which order). Now I guess they could switch the power budget between one and the next, but that would delay the processing of the next frame. While they’re ray tracing one frame they could be rasterizing another, but then you’re drawing more power at once.
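
To make that concrete, here’s a minimal sketch of what a sequential hybrid frame could look like; the pass names and the ordering are hypothetical stand-ins, not Nvidia’s actual pipeline:

```cpp
// Hypothetical pass names, just to illustrate the sequential hand-off.
void rasterPass()   { /* shading on the CUDA cores */ }
void rayTracePass() { /* reflections/shadows on the RT cores */ }
void denoisePass()  { /* cleanup on the tensor cores */ }

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        rasterPass();    // one block of the die is busy...
        rayTracePass();  // ...then another, instead of all of them at once
        denoisePass();
    }
}
```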

I was implying the power draw was lower than expected. I expected much higher power consumption, though we’ve yet to see the draw under a full load with the tensor and other cores doing work.


It’s likely because under a full ray tracing load power consumption will be much, much higher. We are only seeing a rasterizing load here; lots of the die is idle. When that part is running I’m sure power consumption will be much higher.

It also is a way for Nvidia to limit the appeal of AIB cards.

It also makes them look better against old reference GPUs, which is what a lot of reviewers pit them against, and they will likely also need the extra cooling headroom when under RT load.


True. Heat will be somewhat of a concern as well, but then again, the current power load is only being dissipated from a fraction of the overall silicon. So if the power consumption goes up when RTX features are active, I don’t think the temps will go up quite as badly as they would by simply increasing the CUDA power budget.

I did not even think about the two-thirds of the card that may or may not be idle right now. That is going to get very interesting when benchmarks allow for RTX features on and off.
I hope GN thinks of that and goes down the rabbit hole.

Nope. That is true for professional render engines that are commonly licensed out, like Unity or Unreal. But a large proportion of mid-sized game studios that decide to build their own engine (think Grinding Gear Games) use C# as the language for the whole thing, including the render engine.

Also, on top of this, games are not 100% parallel. If they were, you would have weird artifacts from some elements moving while others do not. Because of that, the best engines use asynchronous batching of game logic. The render engine still waits on the game logic, but while one scene is rendering, the changes to the next have already started to be processed. Great for keeping frame times down.
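
As a rough illustration of that pipelining, here’s a minimal C++ sketch that simulates the next frame on a second thread while the current one is being drawn; the GameState struct and the sleep timings are made up for the example:

```cpp
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>

struct GameState { int frame = 0; };  // made-up stand-in for the world state

GameState simulate(GameState prev) {
    std::this_thread::sleep_for(std::chrono::milliseconds(4));  // pretend game logic
    prev.frame += 1;
    return prev;
}

void render(const GameState& s) {
    std::this_thread::sleep_for(std::chrono::milliseconds(8));  // pretend draw calls
    std::printf("rendered frame %d\n", s.frame);
}

int main() {
    GameState current = simulate(GameState{});
    for (int i = 0; i < 5; ++i) {
        // Kick off the game logic for the *next* frame on another thread...
        auto next = std::async(std::launch::async, simulate, current);
        // ...while this thread is still busy drawing the current one.
        render(current);
        // The renderer only stalls here if logic took longer than the draw.
        current = next.get();
    }
}
```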

However, if you look at a lot of existing game engines, they still run the game logic and then render. There is no real separation between the render engine and the game logic. Worse still if the game logic is in an interpreted language.

But here is the key thing: for most games, this doesn’t matter. They can get away with slower code because 1) they can reduce the frequency of updates, and 2) even with the overhead from slower code, their game still runs at the minimum requirements they decided on. Modern desktops are insanely fast. We don’t realise this because so much of the code we run on our systems is so poorly optimised; programs can be running at 1% of the speed they could be and the consumer wouldn’t care.
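
Point 1 is typically done with a fixed logic timestep decoupled from the render rate; a minimal sketch, with the 20 Hz tick rate chosen arbitrarily:

```cpp
#include <chrono>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    const auto tick = std::chrono::milliseconds(50);  // 20 Hz logic, example rate
    auto next_update = clock::now();
    const auto end = clock::now() + std::chrono::seconds(1);
    int logic_ticks = 0, frames = 0;

    while (clock::now() < end) {
        // Run the (possibly slow) game logic only when a tick is due...
        while (clock::now() >= next_update) {
            ++logic_ticks;
            next_update += tick;
        }
        // ...and render as often as the GPU allows in between.
        ++frames;
    }
    std::printf("%d logic ticks, %d frames\n", logic_ticks, frames);
}
```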

The important thing to remember is that there is a difference between a bottleneck on the GPU caused by a CPU that’s too slow and a bottleneck on the GPU caused by delays in the CPU computation. The best engines get rid of the latter almost completely, only adding a few ms to the frame timings. The average engine used in games? It can easily double the frame timings, depending on the content in the game.
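
A crude way to see which case you’re in is to time the two halves of a frame separately; a sketch with hypothetical function names:

```cpp
#include <chrono>
#include <cstdio>

// Hypothetical stand-ins for the two halves of a frame.
void runGameLogic()    { /* simulate the world */ }
void submitDrawCalls() { /* feed the GPU */ }

int main() {
    using clock = std::chrono::steady_clock;
    auto ms = [](auto d) { return std::chrono::duration<double, std::milli>(d).count(); };

    auto t0 = clock::now();
    runGameLogic();
    auto t1 = clock::now();
    submitDrawCalls();
    auto t2 = clock::now();

    // If the logic half dominates, the engine is adding CPU time to every frame;
    // if submission dominates, the CPU genuinely can't feed the GPU fast enough.
    std::printf("logic: %.3f ms, submit: %.3f ms\n", ms(t1 - t0), ms(t2 - t1));
}
```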


Maybe this is why they released before the RTX software is out?

Day 1 reviews will show power consumption that looks less bad, great raster performance, and the “future tech” of RTX as a thing for later - and the power thirst that comes with RTX later won’t be part of the review.


DLSS is looking very promising

https://techreport.com/blog/34116/weighing-the-trade-offs-of-nvidia-dlss-for-image-quality-and-performance

If I put a lightbulb in my computer, does that count as raytracing?


Like so?
https://forum.level1techs.com/t/post-your-tech-cringe/113501/35


Some DLSS side by side, thoughts?

I think it looks pretty good


Looks the same? Aside from a few surfaces here and there looking shinier. It does give a noticeable FPS boost, so it should always be turned on.


I’m the worst person to talk about this, as I will turn off all AA and just want FPS and smooth motion. It does look the same unless you pause and compare. Human eyes are terrible, and the brain makes up the world we see… our actual visual focus is a small circle in the middle of our vision.

So DLSS is making the world look good, and your brain parses it as OK even if you’re really looking at that spot on the shiny surface.


DLSS isn’t native 4K, but the other one is.


True, 1080p AI-upscaled is 1080p made pretty.
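
Taking that 1080p internal resolution at face value, quick arithmetic shows where the FPS boost comes from: the card only shades a quarter of the pixels a native 4K frame would need.

```cpp
#include <cstdio>

int main() {
    const long native4k = 3840L * 2160;  // pixels shaded at native 4K
    const long internal = 1920L * 1080;  // pixels shaded before the AI upscale
    std::printf("DLSS shades %.0f%% of the native 4K pixel count\n",
                100.0 * internal / native4k);  // prints 25%
}
```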


20 series incredibly power limited (cards like EVGA’s will have a second BIOS with 7% more power allotment, not much):

TL;DW - the 20 series is incredibly power limited; unless you want silence and to play with water… don’t bother. EVGA’s card will have a second BIOS for another 7% power allotment, but… it’s 7% more power, that’s it. You just flat out will not get much clock increase for the effort of water blocking. Reasons unknown at this time. Possibly to save power for RTX?

Will LOL if the reason it runs RTX at 1080p60 (barely) is because it throttles the raster side of the card so hard due to the power draw when RTX is in use :smiley:

Jayztwocents has gone crazy:

RTX 2070 was a GTX 1080 (performance-wise) the whole time.