
Nvidia RTX 20XX Thread


#346

At that price, the Titan is (duh, like all of them are/were) most likely aimed at developers who want a card that is perhaps representative of mid-high-end RTX 3xxx-series performance for next year.

i.e., the developer can develop and run the new game on his Titan, and by the time the software is released, that performance will be upper-mid-range in, say, 2020.

Well. Game developers, and those using the cards to actually make money with. Or gamers with way too much disposable income (of which there are a few).

As above, i very much suspect that RTX is going to be one of those things that WILL take off, but it’s going to be on the newer 7nm cards before it’s “worth it”.

Right now developers are dabbling with it in inconsequential ways because the hardware support isn’t out, and the hardware that IS out for them to experiment with on the consumer side still isn’t fast enough to do anything significant with.

i.e., RTX isn’t a “bust” per se. But it’s way early days yet.


#347

You could call RTX the “Rekked” toggle. Go from 160 FPS to 60 FPS. Why would people do that? :slight_smile:
It’s the stupidest setting in a while. We debate anti-aliasing over a few percent here or there, but this is 1/3 the FPS. That’s clown town.
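To put those numbers in perspective, here’s a quick back-of-the-envelope using the 160/60 FPS figures quoted above (just arithmetic, nothing fancy):

```python
# Rough cost of the RTX toggle from the numbers above:
# 160 FPS with it off, 60 FPS with it on. Frame time is the inverse of FPS.
fps_off, fps_on = 160, 60

remaining = fps_on / fps_off        # fraction of FPS you keep
hit = 1 - remaining                 # fraction you lose
frame_time_off = 1000 / fps_off     # ms per frame, RTX off
frame_time_on = 1000 / fps_on       # ms per frame, RTX on

print(f"You keep {remaining:.1%} of your FPS (a {hit:.1%} hit)")
print(f"Frame time goes from {frame_time_off:.2f} ms to {frame_time_on:.2f} ms")
# -> You keep 37.5% of your FPS (a 62.5% hit)
# -> Frame time goes from 6.25 ms to 16.67 ms
```

So “1/3 the FPS” checks out almost exactly: 60/160 = 37.5%.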

I know, it’s a sample size of one game. Not my fault.


#348

I kind of hope it is a bust, in the same way that PhysX for visuals was a bust. It is still used and runs physics for engines, but the Borderlands 2 style of PhysX for purely visual stuff is all but gone.

I am fine with ray tracing staying around as part of engines and overall game betterment. But I hope RTX software dies or at least just becomes a card name for hardware aimed at ray tracing.


#349

RTX in its current form, being more or less optimized for 23.976 fps delivery, is mostly a VFX or 3D animation house’s best thing, because being able to rapidly prototype scenes in a movie means more time to focus on the movie rather than thinking too hard about the IT backbone.

For games, too much of the tech has to be cut down to optimize for real time delivery. If you look at the Control teaser trailer, the noise on the shadows is indicative of one of the major flaws of using less light bounces and trying too hard to polish a *explicit word* for real time delivery.

YES, SSR has some major flaws because it uses the player viewport and the objects in it, producing sometimes-inaccurate reflections, but it will still take a massive compute power increase before the Tensor cores aren’t stuck polishing the *you know what* noisy output. (Mythbusters weren’t allowed to say that word; I’m not saying it either.)
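For anyone wondering what the viewport limitation actually means: here’s a deliberately toy 1-D sketch of the screen-space reflection idea (a hypothetical minimal model, not any engine’s real implementation). A reflected ray marches across the depth buffer, and the moment it leaves the screen there is simply no data to sample:

```python
def ssr_march(depth, color, start_x, direction, max_steps=64):
    """March a reflected ray along a 1-D depth buffer.

    Returns the color of whatever the ray hits, or None when the ray
    leaves the viewport -- the case where SSR has nothing to show.
    """
    x = start_x
    for _ in range(max_steps):
        x += direction
        if x < 0 or x >= len(depth):
            return None                  # ray left the screen: no data at all
        if depth[x] < depth[start_x]:    # ray reached closer geometry
            return color[x]              # treat that as the reflection hit
    return None                          # nothing intersected within budget

# A tiny scene: a crate in front of a distant wall/floor.
depth = [5, 5, 3, 3, 9, 9, 9, 9]
color = ["wall", "wall", "crate", "crate", "floor", "floor", "floor", "floor"]

print(ssr_march(depth, color, start_x=5, direction=-1))  # crate (on-screen hit)
print(ssr_march(depth, color, start_x=5, direction=+1))  # None (marched off-screen)
```

Real engines fall back to cubemaps or just fade the reflection out when that happens, which is exactly where the inaccuracies come from.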


#350

I would have thought a sector as large as movie making would have already had its own specialised hardware for exactly this, rather than waiting for a gamer’s GPU to be created, and that’s not even looking at the workstation and professional hardware.

Hmmm

Also getting off topic. Fuck 24fps, movies are nearing unwatchable with the high-speed action now. Anything that helps further that can fuck right off.


#351

30XX next year? The only way that happens is if the 2 series gets a shitty rebrand.


#352

Not really. Now that they have the base worked out, all they have to do is upgrade and expand. The hard part of the R&D is over; the cards work. Now it’s just a case of more is more for next year.

I will grant they may take a year and a half, maybe two. But unless the launch has been catastrophically bad, they will push for more.


#353

Turing is a long play, not a short one. Nvidia has spent billions on it, it ain’t going anywhere anytime soon.


#354

Highly unlikely.
Without a magical tier process shrink, they will have to re-engineer everything to split it into chiplets (which AMD has been working on since 2012).

I don’t see the 30-series before 2020.


#355

Price went down $500 from the previous Titan, so thanks nvidia!


#356

Nvidia will go to 7nm just like AMD in short order.

Guaranteed: RTX2xxx will be a short lived series due to the oversupply of Pascal and imminent volume production of 7nm parts.

It may just be a simple process shrink (i.e., they won’t throw out the R&D they’ve put into it), but i highly doubt that Nvidia will be sitting on 12/14/16nm process for years while AMD is on 7nm.

If for no other reason than they would get totally owned in mobile, which like it or not is the largest segment of the market these days.

Also… by “next year”, at this point i mean within 12-18 months - not, like… January. Of course they won’t be replacing Turing before, say, early 2020 at this point. We’re only 24 days away from 2019…


#357

Yes, they will go 7nm/10nm, but I suspect not till 2020, and possibly not until 2021. Nvidia aren’t known for making such dramatic technology shifts; AMD, on the other hand, are. There were rumours that Turing was actually meant to be on 10nm but the yields were terrible. I suspect Nvidia’s next architecture, or more likely a Turing refresh, will be on 7/10nm, but it will not be anytime soon.

We also have to remember that AMD are only just going 7nm in 2019. The Vega Instinct parts are cherry-picked and so don’t really count.
Vega 2 would be nice, since async compute is probably going to become a bigger thing now. DX12, Vulkan and Turing have all but guaranteed that AMD could simply have one chip doing specific tasks and another doing something else, as opposed to both doing the same thing to boost traditional performance.


#358

TL;DW: DLSS works well in terms of performance but overall looks worse than TAA. Also DLSS requires a bit of work upfront because the deep learning … well, has to happen before you can use it.
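The “work upfront” point in a nutshell: DLSS is inference of a network trained per game on high-resolution ground-truth frames, while TAA is a fixed algorithm that works on day one. Here’s a deliberately toy sketch of that split (hypothetical names, nothing like the real pipeline):

```python
# Conceptual sketch of why DLSS needs work up front (hypothetical names,
# not Nvidia's actual pipeline): a model must be trained per game on
# high-resolution "ground truth" frames before it can upscale anything,
# whereas TAA needs no training at all.

def train_upscaler(low_res_frames, high_res_frames):
    """Offline step (done ahead of release): learn a low-res -> high-res mapping."""
    # stand-in for the deep-learning step; here we just memorize the pairs
    return dict(zip(low_res_frames, high_res_frames))

def dlss_upscale(model, frame):
    """Runtime step (on the Tensor cores): inference only, no learning."""
    return model.get(frame, frame)  # unseen content falls back, hence artifacts

model = train_upscaler(["lr_frame_a"], ["hr_frame_a"])
print(dlss_upscale(model, "lr_frame_a"))   # trained content: improved result
print(dlss_upscale(model, "lr_frame_b"))   # untrained content: passed through
```

That per-game training step is also why DLSS support lands title by title instead of being a driver-wide toggle.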

My opinion: So far this does not look like progress.


#359

DLSS looks like I have to clean my glasses. Slightly washed out, colors mushed, and text blurry.
In other scenes, DLSS makes the lines look like AA is off. WTF?

I would take a 20% performance hit to not have flickering textures.