Return to

Nvidia RTX 20XX Thread


This Topic is about the GeForce RTX Hardware, not the RTX middleware.
For the RTX Raytracing middleware see: RTX Ray tracing

All the juicy details you’ve been waiting for:

The pricing is exactly what I expected from Nvidia without current AMD Radeon competition.

What’s up with Tensor Cores:

You may have heard a lot of PR speak from Nvidia regarding tensor cores; this is something I have a particular gripe with and want to clarify a bit.

Each ‘tensor core’ is just a matrix multiply-add unit that performs operations on small 4x4 matrices.

Nvidia themselves explain it here:

As an example, a tensor core can perform one matrix multiply-accumulate operation per GPU clock: it multiplies two 4x4 fp16 matrices and adds the resulting fp32 product matrix (size: 4x4) to an accumulator (which is also a 4x4 fp32 matrix).

Nvidia calls this mixed-precision compute because the input matrices are fp16, but the multiplication result and accumulator are fp32 matrices.

Tensor cores are dedicated 4x4 matrix FMA (fused multiply-add) units. But Nvidia marketing had to call them tensor cores.
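To make that concrete, here's a minimal NumPy sketch of the operation described above: one mixed-precision fused multiply-accumulate D = A x B + C, with fp16 inputs and an fp32 product and accumulator. This is just an illustration of the math, not how the hardware works; a real tensor core does this on dedicated silicon in a single clock, and the 4x4 sizes simply match Nvidia's description.

```python
import numpy as np

rng = np.random.default_rng(0)

# fp16 inputs, exactly as in Nvidia's description
A = rng.standard_normal((4, 4)).astype(np.float16)
B = rng.standard_normal((4, 4)).astype(np.float16)

# fp32 accumulator (also 4x4)
C = np.zeros((4, 4), dtype=np.float32)

# Mixed precision: promote the fp16 inputs to fp32, multiply,
# and accumulate in fp32 so precision isn't lost across many FMAs.
D = A.astype(np.float32) @ B.astype(np.float32) + C

print(D.dtype)  # float32
```

Chaining many of these accumulations (C = D, then repeat with new A and B) is exactly what a big matrix multiply decomposes into, which is why these units map so well onto machine learning workloads.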

Feel free to post articles and news stories as they appear; tag me if you want them included in the top post.


I want to be excited, but until there is a good open source driver, I’ll save my money. Nvidia makes good hardware, but their hostility towards the open source community and virtualization makes it hard to justify.


there’s already a thread for this IIRC


Let’s keep it on topic; I bleed open source and run multiple Radeon cards for various purposes.

But this topic is just about the Nvidia tech announcement. :slight_smile:


This is about the actual RTX series GPU announcement.
The other thread is about the RTX middleware.

Also, the Vulkan equivalent works on everything with Vulkan support; 4x4 matrices are nothing special to a GPU, and one can still do raytracing on a Vega 64, or 1080 hardware, etc. Just not as fast.

See buckets of resources starting here from this comment onwards:


ah fair enough

isn’t the synopsis just “wait for benchmarks” for the ga(yy l)m(ao)ers among us


This thread is meant to grow to accommodate that as such information becomes available. Which it will in due course :slight_smile:


It’s much more concise than “floating point array math units”, and they’re designed to accelerate the operations machine learning applications need.



Personally not a fan of the cooler design. Too symmetrical and safe for my liking.

Would’ve been nice if they’d carried over the tri patterns from the Titan XP-style coolers, or had a window in the middle, or something.


This generation just priced me out of my hobby; I’ll probably be skipping until next gen, for the first time since I’ve owned a PC.


I think this is one of the first flagship cards in a while to have a good design language. Doesn’t look like a toy RC car, which is a nice change of pace.


Still says GeForce, whatever that’s supposed to mean.

And people give AMD crap for their naming schemes :joy:


should it not have any branding marks on it? what?

I was talking about the industrial design, not the naming scheme


The branding is part of the design.


To an extent, yes, but only the style of said labelling, not the content. The designers don’t control the marketing department.

I’d rather not have LEDs, but the name is completely immaterial.


Ray tracing technology is pretty impressive and I do give credit to nVidia for trying to push this technology. I think it does have a lot of potential to make gaming and 3D graphics in general much more immersive. However this series and the event was kind of a mess.

The launch event was tightly focused on ray tracing performance rather than general gaming performance. Nor did nVidia go into much detail on the specs of these cards at the event proper, instead posting them on their site. They also obscured a lot of the numbers or cherry-picked data points, such as the famous “2070 is 6X faster than a Titan X” headline I keep seeing everywhere.

Seems to me that these cards won’t be that much faster in non ray traced titles, i.e. the vast majority of them. Probably about a 15% increase across the board, with higher power consumption. That isn’t particularly impressive, especially for the cost they are charging for these things. The prices are insane. The partner cards won’t be much cheaper either. The new fan design and better VRM have taken a lot away from the partners and what they usually do; they will be forced to go higher end and compete that way. Leaked details from Gigabyte and others point to the same or even higher prices than the nVidia reference cards. Quite worrying. (Though apparently it is okay, because these aren’t meant for gamers; they are actually professional cards made for small businesses and self-employed 3D modelers, so the pricing is great for them, and gamers just need to suck it up and deal with it because we don’t matter.)

Also, what we know about TSMC’s 7nm roadmap, as well as the fact that they are launching the Ti class now rather than waiting, points to the idea that the 20XX series will be very short-lived. It is highly likely that significantly faster 7nm-based GPUs will be available early to mid next year, making these cards irrelevant very quickly. Seems to be nVidia milking people now, and then even further when the 7nm stuff drops.

It also raises the question of the mid range. With all this extra stuff on there, these are big GPUs. To get lower prices we need smaller dies, and that, coupled with some leaks showing the 60 series and below still being GTX cards, points to a huge bifurcation of nVidia’s lineup, with some cards having ray tracing tech and the others not. This could be seriously damaging to the GeForce and GTX brands, or at least confusing to the average consumer. It also makes me question how much actual adoption we will see. Yes, it is baked into DirectX and Vulkan now, but if AMD cards and half of nVidia’s lineup don’t support it particularly well, why even bother?

So to sum up: basically, still a lot of unknowns.

What is the role of the AIB partners this generation and how will they price their cards and what features will they add?
What is the role of SLI and NVLink, as they seemingly have a bit more of a focus this time around?
What games and applications will leverage these extra features?
How will the community react to the price, segmentation and large change to the Geforce/GTX brand?
What is the actual performance and power consumption in non RT focused titles?
How will nVidia move forward from here?

Will be watching this cycle with interest.


The whole thing is kit ‘n’ kaboodle, part and parcel; you do not get one without the other.

At any rate, we could argue all day about preferences. I think the design is fugly; it’s overly simplistic with poor contrasting areas. The previous generation Founders Editions were iconic, even if I wasn’t a huge fan of those either. At least they looked techie. This looks slapped together to me; too plastic in its execution (granted, all we have is renders right now, so that doesn’t help).


Thank you for the quality response @DerKrieger, love seeing well written opinions like this. :blush:
More eloquently put than I would have.


I want my graphics card to scream (silently) that it has more power than an out of control toddler who’s just eaten a large pile of sugar. Its looks must resemble that of a box which has been repeatedly suplexed by said toddler.