Nvidia RTX 20XX Thread

Did a little bit of reading. Looks like RTX is Nvidia’s implementation of DXR with some extra features added on top.


So while applications designed specifically for RTX may not work on AMD cards, applications implementing plain DXR will make use of the RTX pipeline when running on Nvidia hardware.
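Just to illustrate that point, here is a minimal sketch (assuming the Windows 10 SDK headers that include DXR) of how an application asks whether ray tracing is available at all - the query itself is vendor-neutral, and on Turing the driver simply answers it through the RTX backend:

```cpp
// Minimal sketch: ask the D3D12 runtime whether the installed driver exposes DXR.
// Assumes a Windows 10 SDK new enough to ship the DXR (ID3D12Device5) interfaces.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device5> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No suitable D3D12 device available.\n");
        return 1;
    }

    // Vendor-neutral capability check; the RTX driver reports tier 1.0 here.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::printf("DXR tier 1.0+ reported by the driver.\n");
    } else {
        std::printf("DXR not supported on this device/driver.\n");
    }
    return 0;
}
```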

What I’m interested in is how complicated the ASICs are in design. Are they just generic ASICs designed for handling ray tracing, or some Nvidia special-sauce cores making use of ray tracing ASICs?

3 Likes

I also wonder if Nvidia would license the technology to AMD.
Competition is good, and maybe in a year’s time the first generation might be licensed to AMD, allowing for competition in the market.

The same thing happened recently with Intel licensing AMD graphics technology, and in the past the Xbox 360 used a cut-down version of the PS3’s processor.

Another thought: an RTX card designed only for ray tracing. License out the NVLink connector, and Nvidia could sell cards designed specifically for adding RTX functionality alongside any card that makes use of the NVLink connector.

Highly unlikely, but it would allow AMD to remain competitive if they don’t have their own solution, while potentially increasing income for Nvidia.

People who buy AMD or Nvidia cards are unlikely to switch at this point; brand loyalty is too strong in this market. But providing a means for consumers to gain access to the RTX pipeline while taking profits from their competitor through licensing would make a lot of sense.

Power limits seem to be a substantial block for how these cards overclock. The +23% is a pretty shallow limit - enough so that the +30% of EVGA’s cards (according to GamersNexus) made a sizeable difference. This is weird because the power delivery hardware of the reference boards is massive compared to any other reference Nvidia design - and capable of delivering way more than the power being used.

These RTX cards essentially have three “sectors”: CUDA cores, RT cores, and Tensor cores.

The benchmarking being done right now is only using one of the three “sectors” of the RTX cards. This means that all the allowed power is currently being used for standard GPU rasterization.

But what happens when the card starts to utilize the additional features of RTX?

If the power limit is a hard cap for the entire card, that means that once we start ray tracing, the power it consumes is no longer available for rasterization. This compounds if the tensor cores are used.

Either this is wrong, and when the RTX silicon is utilized the overall power limit increases (and thus heat output goes up), OR when ray tracing and/or tensor processing is used, the rasterization ability of the GPU significantly drops. I wonder if this is part of the reason that 1080p/60 Hz seems very difficult to achieve even on the 2080 Ti.
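To put rough numbers on that worry, here is a trivial sketch of the shared-budget math - every wattage figure is a made-up placeholder, not a measurement:

```cpp
// Back-of-envelope sketch of the "shared power limit" concern.
// All numbers are hypothetical placeholders, not measured figures.
#include <algorithm>
#include <cstdio>

int main() {
    const double board_limit_w   = 260.0;  // assumed board power limit
    const double raster_demand_w = 260.0;  // rasterization alone can already hit the cap
    const double rt_draw_w       = 60.0;   // hypothetical RT-core draw under load
    const double tensor_draw_w   = 40.0;   // hypothetical tensor-core draw (DLSS etc.)

    // If the limit is per-board, whatever the RT/tensor blocks use comes
    // straight out of the budget left for the shader/raster side.
    double raster_budget_w = board_limit_w - rt_draw_w - tensor_draw_w;
    double raster_fraction = std::min(1.0, raster_budget_w / raster_demand_w);

    std::printf("Raster side gets %.0f W of the %.0f W it could use on its own (~%.0f%%),\n",
                raster_budget_w, raster_demand_w, raster_fraction * 100.0);
    std::printf("so clocks/FPS would have to drop unless the overall limit is raised.\n");
    return 0;
}
```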

1 Like

This is definitely a possibility.

I was surprised by the power consumption numbers. People were pleasantly surprised that the new cards weren’t that much more power hungry than the old ones… I bet that changes under a ray tracing workload…

I wasn’t really.

Don’t forget that vs. Pascal, these cards have more CUDA cores, plus some other specialist hardware (tensor cores, RT cores(?))… on essentially the same process tech. Laws of physics haven’t changed in the past 18-24 months…

To be honest, I’d avoid 20xx entirely and wait for 21xx, which will be on 7nm and might actually have some software out by then to make use of it. If you’re on something older than Pascal or high-end 9xx, buy Pascal (or AMD) much cheaper for now, and wait for the 7nm implementation of this technology. I don’t see ray tracing being anything to write home about until the RT cores come down to a mass-market price point, and these simply aren’t there…

I personally think 20xx is going to be a relatively short-lived stop-gap card while Nvidia waits for 7nm to be ready next year.

1 Like

Performance definitely didn’t justify that price tag. What a useless series. But with no competition on the horizon, they can do what they want.

Will be interesting to see how 2019 plays out.

Navi is rumoured to be GTX 1080+ performance in the $200-250 price segment.

I think Nvidia are perhaps planning to hold firm on pricing/keep prices high to make bank before that comes out.

Which is why it is extra weird that the power delivery is so beefy on these. It’s totally possible that’s just a “look at me” thing they did, or maybe it’s there to improve stability and temps, since the heat output is spread across more surface area? Hard to say.

Either way, I am very much looking forward to how this all plays out. It’s exciting to have a new technical direction in the GPU world; I just hope that it goes somewhere and AMD is able to step in and play ball.

Personally I think there would be a market for a separate “RT” card.

Keep your existing GPU - sell the add-on card with a shit tonne of RT cores on it (i.e., dedicate the entire die to RT cores) to work in tandem with it.

They’ve mostly killed off SLI, so people likely have plenty of free PCIe slots for another card… whether they’re team green or team red.

I don’t see Nvidia licensing anything. AMD will have its own open source version (probably inferior) and implementation, which will be accessible to other vendors like Intel’s dedicated GPUs.

Then we will enter a true console-like PC era. The GPU you buy will dictate which games you can play.

1 Like

The cards completely hinge on that DLSS and ray tracing technology; if we see poor adoption rates then it’s going to be quite sad.

IMO DLSS seems to be the only thing of real use with these cards. I assume it uses the tensor cores to process the frames for upscaling…

True. It is sad it launched with a nothing burger. I believe it will be a month before Windows ships their software and Nvidia plugs into it.

DLSS uses the same silicon as shaders do. So it’s still a matter of losing shader performance to have DLSS running on the final frame.

So the 2080 Ti with DLSS on may perform the same as a 1080 Ti but look much better, because the 2080 Ti will have to dedicate a portion of its compute to run the tensor AI for DLSS.

All still theory, and no RTX titles to benchmark… a paper launch, so to speak.
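For what the theory is worth, the trade-off being argued can be sketched with some frame-time arithmetic - all of these millisecond figures are invented placeholders, not benchmarks:

```cpp
// Rough frame-time sketch of the DLSS trade-off discussed above.
// Every millisecond figure is a made-up placeholder, not a benchmark.
#include <cstdio>

int main() {
    const double native_4k_ms = 20.0;  // hypothetical native 4K frame time
    const double internal_ms  = 11.0;  // hypothetical frame time at the lower
                                       // internal resolution DLSS renders at
    const double dlss_cost_ms = 2.0;   // hypothetical per-frame upscale cost,
                                       // whether it runs on tensor cores or
                                       // eats shader time as claimed above

    double dlss_total_ms = internal_ms + dlss_cost_ms;
    std::printf("Native 4K: %.1f ms (%.0f fps)\n", native_4k_ms, 1000.0 / native_4k_ms);
    std::printf("DLSS:      %.1f ms (%.0f fps)\n", dlss_total_ms, 1000.0 / dlss_total_ms);
    // The pitch only works if (internal render + upscale) stays well under native;
    // if the upscale steals shader time, that margin shrinks.
    return 0;
}
```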

I have not read up on it yet, but Jayz was saying that the RT part of RTX was Vulkan-based from Nvidia…

It was a one-line throwaway remark with nothing to back it up, but it does not sound like the Nvidia we all know and despise. An open project that they actually want to improve and support, rather than get inside and loot for their own proprietary purposes?

So just for fun, I decided to search for a Windows 7 driver for the RTX series, and the PUBLIC gets one, but the press didn’t.

https://www.nvidia.com/download/driverResults.aspx/138068/en-us

This is why TechPowerUp misreported that there’s no driver for Windows 7, 8, 8.1 for RTX, when most product pages specify it actually exists.

1 Like

It is an interesting situation. I wonder if that will be the case with it. I know two of the three sectors can work independently and at the same time, so when that comes into effect the power draw should go up; I forget which two, though. The third is shared with the shaders. It might draw a different amount of power, but as far as I know right now the card has to shut off one sector to start the other in the shared region, so power there could go up or down. I have no idea about the other parts’ draw.

But is that even usable? If the card relies on DX12 or parts thereof, Windows 7 and 8.1 don’t get that; they stop at DX11. So surely in this case the RTX card just becomes a monstrously expensive GTX. I am assuming it will not be able to use the other parts that make it an RTX if the API simply does not exist.

Vulkan.

You’re forgetting Vulkan.

3 Likes

Are Nvidia invested enough in that? And it is a similar problem to the RTX games. Not a whole lot of them out there. And you are still paying huge prices for something you can’t really use properly.

Just seems like a colossal waste. But so do a lot of things with these cards.

Vulkan is being heavily invested in by Nvidia through their willingness to work with Steam and DXVK.

Vulkan allows RTX to run on Windows 7 even if you have no DX12. Nvidia pushing for Vulkan for RTX is very good, especially for Linux gaming.
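A minimal sketch of what that check looks like (assuming the VK_NV_ray_tracing vendor extension Nvidia shipped around this time): enumerate the device extensions and see whether ray tracing is exposed. Nothing here touches DX12, which is why the same code runs on Windows 7 or Linux:

```cpp
// Minimal sketch: list each Vulkan GPU and report whether it exposes the
// VK_NV_ray_tracing vendor extension. OS-agnostic - no DX12 involved.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkApplicationInfo app = {VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ici = {VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool hasRt = false;
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName, "VK_NV_ray_tracing") == 0) hasRt = true;

        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("%s: VK_NV_ray_tracing %s\n", props.deviceName,
                    hasRt ? "available" : "not available");
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```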

3 Likes

Nvidia needs all the help in the world. Even if 7nm next year makes RTX cheaper, AMD and (in 2020) Intel will have lean, shader-only cards with FPS-per-dollar numbers that make buyers look.

Going custom with two competitors greedily after the same space may be hard. Even for Nvidia.