
Nvidia RTX 20XX Thread


At that price, the Titan is (duh, like all of them are/were) most likely aimed at developers, giving them a card that is perhaps representative of mid-to-high-end RTX 3xxx series performance next year.

i.e., the developer can develop and run the new game on their Titan, and by the time the software is released, that level of performance will be upper-mid-range in, say, 2020.

Well. Game developers, and those using the cards to actually make money with. Or gamers with way too much disposable income (of which there are a few).

As above, I very much suspect that RTX is going to be one of those things that WILL take off, but it's going to be on the newer 7nm cards before it's "worth it".

Right now developers are dabbling with it in inconsequential ways because the hardware support isn’t out, and the hardware that IS out for them to experiment with on the consumer side still isn’t fast enough to do anything significant with.

i.e., RTX isn’t a “bust” per se. But it’s way early days yet.



You could call RTX the "Rekked" toggle. Go from 160 FPS to 60 FPS. Why would people do that :slight_smile:
It's the stupidest setting in a while. We debate anti-aliasing over a few percent here or there, but this is 1/3 the FPS. That's clown town.

I know, a sample size of one game. Not my fault.
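To put that toggle in frame-time terms (FPS numbers from the post above; the anti-aliasing cost is an assumed ballpark, not a measurement):

```python
# Rough frame-time math for the "Rekked" toggle: 160 -> 60 FPS is a
# ~10 ms/frame cost, while AA at an assumed ~3% FPS hit is well under 1 ms.

def frame_time_ms(fps):
    """Milliseconds spent rendering one frame at a given FPS."""
    return 1000.0 / fps

rtx_cost = frame_time_ms(60) - frame_time_ms(160)         # RTX on vs. off
aa_cost = frame_time_ms(160 / 1.03) - frame_time_ms(160)  # assumed ~3% AA hit

print(f"RTX cost: {rtx_cost:.2f} ms/frame")  # ~10.42 ms
print(f"AA cost:  {aa_cost:.2f} ms/frame")   # ~0.19 ms
```

Same "a few percent" framing, but in milliseconds the gap is roughly 50x.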



I kind of hope it is a bust, in the same way that PhysX for visuals was a bust: PhysX is still used and runs physics for engines, but the Borderlands 2 style of purely visual PhysX is all but gone.

I am fine with ray tracing staying around as part of engines and overall game improvement. But I hope the RTX software dies, or at least just becomes a card name for hardware aimed at ray tracing.



RTX in its current form, being more or less optimized for 23.976 fps delivery, is mostly a VFX or 3D animation house's best thing, because being able to rapidly prototype scenes in a movie means more time to focus on the movie rather than thinking too hard about the IT backbone.

For games, too much of the tech has to be cut down to optimize for real-time delivery. If you look at the Control teaser trailer, the noise on the shadows is indicative of one of the major flaws: using fewer light bounces and trying too hard to polish a *explicit word* for real-time delivery.

YES, SSR has some major flaws because it uses the player viewport and the objects in it, producing sometimes inaccurate reflections, but it will still take a massive compute power increase before the Tensor cores aren't stuck polishing the *you know what* noisy output. (Mythbusters weren't allowed to say that word; I'm not saying it either.)
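The noisy-shadow complaint comes straight from Monte Carlo statistics: error falls roughly as 1/sqrt(samples), so the 1-2 rays per pixel a real-time budget allows leave exactly the noise the denoiser has to clean up. A toy sketch (all numbers invented for illustration):

```python
# Toy Monte Carlo soft-shadow estimate: each sample is a binary shadow-ray
# hit (1 = lit, 0 = occluded). The standard deviation of the estimate falls
# as ~0.5/sqrt(samples), so low sample counts mean visibly noisy shadows.
import random

def shadow_estimate(true_visibility, samples, rng):
    """Average of `samples` binary shadow-ray outcomes."""
    hits = sum(1 for _ in range(samples) if rng.random() < true_visibility)
    return hits / samples

rng = random.Random(42)
for spp in (1, 4, 64, 1024):
    trials = [shadow_estimate(0.5, spp, rng) for _ in range(2000)]
    mean = sum(trials) / len(trials)
    var = sum((t - mean) ** 2 for t in trials) / len(trials)
    print(f"{spp:5d} spp -> std dev {var ** 0.5:.3f}")  # ~0.5/sqrt(spp)
```

At 1 sample per pixel the pixel is basically a coin flip; offline renderers throw hundreds or thousands of samples at the same problem, which is the compute gap the post is pointing at.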



I would have thought a sector as large as movie-making would already have had its own specialised hardware for exactly this, rather than waiting for a gamer's GPU to be created, and that's not even counting the workstation and professional hardware.


Also getting off topic: fuck 24fps, movies are nearing unwatchable with high-speed action now. Anything that helps entrench that can fuck right off.



30XX next year? The only way that happens is if the 20 series gets a shitty rebrand.



Not really. Now that they have the base worked out, all they have to do is upgrade and expand. The hard part of the R&D is over; the cards work. Now it's just a case of more is more for next year.

I will grant they may take a year and a half, maybe two. But unless the launch has been catastrophically bad, they will push for more.



Turing is a long play, not a short one. Nvidia has spent billions on it, it ain’t going anywhere anytime soon.



Highly unlikely.
Without a magical tier process shrink, they will have to re-engineer everything to split it into chiplets (which AMD has been working on since 2012).

I don't see the 30 series before 2020.



Price went down $500 from the previous Titan, so thanks nvidia!



Nvidia will go to 7nm just like AMD in short order.

Guaranteed: RTX2xxx will be a short lived series due to the oversupply of Pascal and imminent volume production of 7nm parts.

It may just be a simple process shrink (i.e., they won't throw out the R&D they've put into it), but I highly doubt that Nvidia will be sitting on a 12/14/16nm process for years while AMD is on 7nm.

If for no other reason than they would get totally owned in mobile, which like it or not is the largest segment of the market these days.

Also… by "next year", at this point I mean within 12-18 months — not like… January. Of course they won't be replacing Turing before, say, early 2020 at this point; we're only 24 days away from 2019…



Yes, they will go 7nm/10nm, but I suspect not till 2020, and I wouldn't be surprised by 2021. Nvidia aren't known for making such dramatic technology shifts; AMD, on the other hand, are. There were rumours that Turing was actually meant to be on 10nm but the yields were terrible. I suspect Nvidia's next architecture, or more likely a Turing refresh, will be on 7/10nm, but it will not be anytime soon.

We also have to remember that AMD are only just going 7nm in 2019, and the Vega Instinct parts are cherry-picked, so they don't really count.
Vega 2 would be nice, since async compute is probably going to become a bigger thing now. DX12, Vulkan and Turing have all but guaranteed that AMD could simply have one chip doing specific tasks and another doing something else, as opposed to both doing the same thing to boost traditional performance.



TL;DW: DLSS works well in terms of performance but overall looks worse than TAA. Also DLSS requires a bit of work upfront because the deep learning … well, has to happen before you can use it.

My opinion: So far this does not look like progress.
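For what it's worth, the performance side of DLSS is just pixel arithmetic: render fewer pixels internally, then upscale to the output resolution. A quick sketch; the internal resolution here is an assumption for illustration, not a figure from the video:

```python
# Back-of-envelope for where DLSS's speedup comes from: shade at a lower
# internal resolution, then upscale. The 1440p internal resolution for a
# 4K output is an assumed example, not a confirmed figure.
def pixels(width, height):
    return width * height

native = pixels(3840, 2160)    # 4K native render
internal = pixels(2560, 1440)  # assumed DLSS internal resolution

print(f"Shaded pixels drop by {native / internal:.2f}x")  # 2.25x fewer
```

Which is also why the image-quality comparison against TAA matters so much: the FPS gain is mostly just rendering fewer pixels, and the upscaler has to earn the difference back.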



DLSS looks like I need to clean my glasses: a slight wash, colors mushed and text blurry.
In other scenes, DLSS makes lines look like AA is off. WTF?

I would take a 20% performance hit to not have flickering textures.



For something that “just works” there sure as fuck seems to be a lot of asterisks next to RTX.



There's a whole pile of What The Actual Fuck going on with nVidia right now.

So DLSS (Deep Learning Super Sampling) is finally available in, drum roll please… 1 game. 1, one, uno. Good work for a start there.

So DLSS is garbage, visually. Wow good times nVidia.

On top of that, you have to have DXR enabled for DLSS. You take DXR's performance hit just to use DLSS, even if you don't want the ray tracing. Fantastic… But on the upside, DLSS is so bad you will never want to use it anyway.

Further, and this is a good one: the 2080 (Ti) won't let you run DLSS at 1080p if you wanted that, BUT the 2060 will run DLSS at 1080p… WTF!

So now even Jay is like "fuck DXR", and nVidia are forcing you to enable it for a feature that is terrible, so no one will use either. Brilliant. One game, and no one will even use the features!

Hahahhahahahaah, nVidia have made a mess of the entire thing.



The AI cores on the RTX cards can only process a frame so fast. At 1080p it is disabled because the 2080 (Ti) would produce frames too fast for the AI DLSS to process, meaning it would actually be slower.

So it is disabled, or every tech tuber would be showing that turning it on made the game slower.
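The argument above is easy to sketch: if DLSS costs a roughly fixed chunk of frame time, it only pays off when the native frame is expensive enough to dwarf that cost. All timings below are invented for illustration, not measured:

```python
# Sketch of the fixed-overhead argument: DLSS renders at a reduced internal
# resolution (assumed ~44% of the pixels) plus a roughly fixed per-frame
# processing cost. On cheap frames, that fixed cost can make it a net loss.
def fps_with_dlss(native_ms, dlss_overhead_ms, render_scale=0.44):
    """FPS with a scaled-down render plus a fixed DLSS cost (all assumed)."""
    return 1000.0 / (native_ms * render_scale + dlss_overhead_ms)

for native_ms in (5.0, 12.0, 25.0):  # fast 1080p frame vs. heavy 4K frame
    base_fps = 1000.0 / native_ms
    dlss_fps = fps_with_dlss(native_ms, 3.0)
    print(f"{native_ms:4.1f} ms native: {base_fps:6.1f} fps -> "
          f"{dlss_fps:6.1f} fps with DLSS")
```

With these made-up numbers, the 5 ms (200 FPS) frame gets slightly slower with DLSS while the 25 ms frame nearly doubles its FPS — the same crossover logic behind locking it out at 1080p on the fastest cards.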



Good lord, it’s worse than I thought.



You know it’s bad when Jayz2dense shreds Nvidia



That makes a lot of sense. I wonder, then, why nVidia did not make that clear from the start.