Nvidia RTX 20XX Thread

Neither Tensor nor RT cores are used in existing games with the usual purely rasterized rendering.

1 Like

I’m waiting to see what those Tensor cores can do in Milkyway@home and Einstein@home.

Adaptive Temporal Antialiasing (ATAA) could maybe come quickly to all sorts of games, like my trusty Warframe.

3DMark should also have that as an option in a new benchmark, so “we” could toggle it on and off if there is nothing else to test.

If I were him and wanted to save the site, I would publicly declare the fuckup as a fuckup and announce that I’d never write an article again.

1 Like

Jensen Huang’s genius idea:

I was working on this only to find out someone else beat me to it. Also sorry for posting this…

2 Likes

Except he’s talking about computational tasks in a professional environment, not about gaming.

Moneysplaining with a super mysterious tone which eventually locks permanently on

*cough*
I get this unbearable feeling from all small-talk-type word salad that I want to get out right now, and/or the veins in my temples bulge and I’m irritated and angry. Some people manage to cause that immediately, but Jensen is borderline neutral-irritating :smiley:

Looking at the “just buy it” article from Tom’s, one cannot be so sure about that.

The “just buy it” article is a stupid opinion piece from Tom’s.

1 Like

Opinion piece or obvious advertisement. We will never know.

1 Like

It doesn’t seem to be written well enough to be an advertisement. That thing is all over the place… it seems like fanboy ramblings.

Or somebody fawning and gushing in the hopes of getting some scraps from Nvidia.

1 Like

I think it’s closer to “look at me, I am willing to sell my reputation in an opinion piece, please do not drop us from getting free shit, k thanks”. (It had to be made after the real piece said “nope, do not pre-order”.)

There is more shit going on with TH and Nvidia. This is from their “history” slideshow:

How the fuck do they know it’s a quantum leap? How do they know it’s viable? And Tensor cores, we have to have Tensor cores.

1 Like

AFAIK it is the smallest leap possible.

These are BIG dies though, which is why the pricing is high.

That said:

  • headline feature is not used in any current games
  • headline feature is not going to be a game-breaking thing if turned off
  • without the headline feature, the performance vs. previous gen is rumoured to be only 15-18% better
  • even Nvidia’s own mainstream cards (2060 and below) won’t support the headline feature
  • none of the consoles will support the feature, so games won’t make critical use of it
  • actual visual impact of the headline feature will be minimal; we’ve gotten by with rasterization just fine so far
  • the headline feature has a massive frame rate impact, and I really don’t think this v1.0 hardware is worth using it with, especially given the above points. Anyone playing an FPS, for example, would MUCH rather have, say, 144 Hz at 1080p or 1440p than 30-60 fps with a few barely noticeable, non-game-breaking-if-they-aren’t-there visual effects.

And yeah, for the reasons I just listed above, the Tom’s “just buy it” editorial is extremely disappointing and dubious advice from what was previously a long-term reputable site.

Also interesting: AdoredTV did a video recently showing massive performance hits (10-15%) on Pascal (well, anything pre-Turing) when rendering in HDR, e.g. roughly 100 fps dropping to 85-90 fps, while AMD cards are barely affected. Looks like when HDR is a thing, the AMD cards are going to age much better.

2 Likes

Well, not only is the “just buy it” piece such a shill article, but you also have what I posted above from their “history” slideshow, which yammers on about something that is not even released. Looks like Nvidia bought TH hook, line and sinker.

1 Like

Yeah, looks like it.

At least, one would HOPE that they got a big payout for the massive hit to their credibility that this has incurred.

If they didn’t and it’s purely some NV fanboi editor they have on staff, well, karma’s a bitch…

1 Like

dunno

I’m kinda curious to see if people try to analyze what Nvidia is actually doing with ‘ray tracing’.

If I had to guess, there’d be some significant cost savings in sorta algorithmically picking when to ‘ray trace’ and when to just rasterize. Maybe for a lot of the ‘ray tracing’ they just do that ray-casting thing: if it’s not a reflective surface, only count the first ray instead of recursing, or do something special for shadows, etc. (rough sketch of what I mean below).
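To be clear, this is just my guess at the shape of it, not anything Nvidia has confirmed. A minimal toy sketch of the “only spend rays where it pays off” idea, where the 0.1 reflectivity threshold is made up and a constant “sky” stands in for a real ray hit:

```cpp
#include <cstdio>

struct Surface {
    float reflectivity;  // 0 = fully diffuse, 1 = mirror
    float baseColor;     // grayscale stand-in for a rasterized shading result
};

// Toy one-bounce reflection: blend the raster result with a constant sky.
float traceReflection(const Surface& s) {
    const float sky = 1.0f;  // pretend every reflected ray hits a bright sky
    return s.baseColor * (1.0f - s.reflectivity) + sky * s.reflectivity;
}

// Hybrid shading: rasterize everything, spend the ray budget only where
// the surface is reflective enough for anyone to notice.
float shade(const Surface& s) {
    float color = s.baseColor;        // plain rasterized shading by default
    if (s.reflectivity > 0.1f)        // heuristic cutoff (made-up threshold)
        color = traceReflection(s);   // single ray, no recursion
    return color;
}

int main() {
    Surface brick  {0.02f, 0.40f};    // near-diffuse: never traced
    Surface mirror {0.90f, 0.10f};    // reflective: gets the one ray
    std::printf("brick=%.2f mirror=%.2f\n", shade(brick), shade(mirror));
    return 0;
}
```

The point is just that diffuse surfaces never pay the ray cost; a real implementation would decide per-pixel with far smarter heuristics than one threshold.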

But it looks like more of the same shit that started to annoy me when DX10 first hit: a bunch of stuff got too reflective and looked unrealistic, all these people waxing their foreheads and shit like some kinda comical bald mall, rocks/bricks etc. reflecting light like they’re wet or somethin’.

Will have to see if it gets used in nice, semi-realistic-looking ways.

I believe they’re doing ray tracing only for some surfaces, firing a limited number of rays, and then using the tensor cores for intelligent de-noising (so that the very limited ray count doesn’t look like shit). Toy sketch of that flow below.
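Assuming that’s right, the pipeline shape is “few samples per pixel, then denoise”. Here’s a toy sketch with a plain 3x3 box filter standing in for the tensor-core denoiser, which in reality is a trained neural network, not a box blur:

```cpp
#include <cstdio>
#include <cstdlib>
#include <vector>

int main() {
    const int W = 8, H = 8;
    std::vector<float> noisy(W * H), denoised(W * H);

    // Stage 1: "fire a limited number of rays" -- one noisy sample per
    // pixel of a flat mid-grey surface (true value 0.5).
    std::srand(42);
    for (float& p : noisy)
        p = 0.5f + (std::rand() / (float)RAND_MAX - 0.5f) * 0.4f;

    // Stage 2: denoise. A 3x3 box filter stands in for the neural
    // denoiser that (reportedly) runs on the tensor cores.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            float sum = 0.0f;
            int   n   = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    const int nx = x + dx, ny = y + dy;
                    if (nx >= 0 && nx < W && ny >= 0 && ny < H) {
                        sum += noisy[ny * W + nx];
                        ++n;
                    }
                }
            denoised[y * W + x] = sum / n;
        }

    std::printf("pixel(3,3): noisy=%.3f denoised=%.3f (true value 0.500)\n",
                noisy[3 * W + 3], denoised[3 * W + 3]);
    return 0;
}
```

Obviously the interesting part is how much better a trained denoiser does than this blur at preserving edges, but the data flow is the same: noisy low-sample image in, smoothed image out.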

AMD also has their own real-time ray tracing coming (and support is going to be in Vulkan), so I’ll be very surprised if this Nvidia stuff takes off, as it will not work on any of the consoles (which are all AMD), AMD PC GPUs, etc.

It will be exactly like PhysX.

A niche thing that nobody really cares about but Nvidia can try to make a big show about.

the AMD ray tracing side:
https://www.game-debate.com/news/24748/amd-hits-back-with-prorender-real-time-ray-tracing-support-for-vulkan-api

edit:
Nvidia RTX is DirectX-only at the moment (?). AMD’s is Vulkan and thus will work with Linux.

2 Likes