Nvidia RTX 20XX Thread

Simple reason: You would not buy a product with one feature marked “worthless in 1080p”

It just works!

Nvidia Vaseline on the screen™


Easy sauce recipe for DLSS:

Take 1:

Then smear across screen:

And you too can have the DLSS feature enabled!


The way it’s meant to be spread.


DLSS looks so much like Vasoline that STP is going to file a copyright strike against Nvidia’s videos.


Even I didn’t expect it to be that bad…

Conclusion timestamped:


A much more in-depth look into the workings of DLSS:


The TL;DR is that the tech is fake news.

The reason could be many things, from “not ready yet” to “utterly broken”.

I think it will get better over time. Optimization will depend heavily on Nvidia, so it comes down to how they handle things. Can anyone get their game “DLSS ready”, or will it be locked behind Nvidia partnership deals? My guess is that this will end up like G-Sync: AMD will come up with something similar down the line, it will be open source, more people will use it, and Nvidia will eventually relent.

I doubt it. Because:

:wink:


Not really.
The “educated guesses” the neural network makes can only do so much.
Getting better quality would require many more nodes, reducing the performance gain to a net zero.

Algorithms like bilinear, bicubic, sinc, Lanczos, Fourier interpolation, edge interpolation, or vectorization already exist. Mixtures of them are used in most modern televisions and consoles.
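For what it’s worth, those classic filters are each a single library call in standard image tooling. A minimal sketch, assuming Pillow is installed and a hypothetical 1080p capture named frame_1080p.png exists:

```python
# Minimal sketch: upscale a 1080p capture to 4K with the classic filters
# mentioned above. "frame_1080p.png" is a placeholder file name.
from PIL import Image

src = Image.open("frame_1080p.png")   # e.g. a 1920x1080 screenshot
target = (3840, 2160)                 # 4K UHD

for name, flt in [("bilinear", Image.BILINEAR),
                  ("bicubic", Image.BICUBIC),
                  ("lanczos", Image.LANCZOS)]:
    src.resize(target, flt).save(f"frame_4k_{name}.png")
```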

DLSS isn’t complete bullshit, but it isn’t magic either. It’s just another upscaling technique, and it has a quality hit just like the very popular checkerboard rendering. Now, that quality hit is probably imperceptible to the vast majority of consumers, which is why people believe the PS4 Pro and XboneX can do real 4K when even the XboneX GPU is only about as fast as an RX 580.
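To put rough numbers on that quality hit (just pixel counts, not measurements from any particular game): checkerboard rendering shades about half of a 4K frame, and a 1440p render upscaled to 4K shades even less.

```python
# Rough pixel-count comparison: how much of a "4K" frame is actually shaded.
native_4k    = 3840 * 2160      # 8,294,400 pixels
checkerboard = native_4k // 2   # roughly half shaded, rest reconstructed
from_1440p   = 2560 * 1440      # a common internal resolution for 4K upscaling

print(f"native 4K:       {native_4k:,} pixels")
print(f"checkerboard 4K: {checkerboard:,} pixels ({checkerboard / native_4k:.0%})")
print(f"1440p upscaled:  {from_1440p:,} pixels ({from_1440p / native_4k:.0%})")
```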

The problem is it’s proprietary to Nvidia hardware, and since it isn’t markedly superior to non-proprietary techniques, it deserves to die. You’ll only see it in games where Nvidia pays off the studio to do it.

I do think some sort of neural-network-trained upscaling technique will be added to Vulkan and DirectX soon enough.

It has its place on the 2060 and 2070, but there’s no need on the 2080 and up; they can run 4K native at or above 60 fps in most cases.

Not with raytracing on, and raytracing definitely isn’t bullshit. Besides, who wants to sit around at 60Hz? I can tell a real difference up to 90Hz, and some people claim they can feel 120/144Hz too.
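As a side note on the refresh-rate argument, the frame-time budget shrinks fast as the target Hz goes up; a quick calculation (nothing GPU-specific, just 1000 ms divided by the refresh rate):

```python
# Frame-time budget at each refresh rate: 1000 ms divided by the target Hz.
for hz in (60, 90, 120, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
```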

Meh, real-time ray tracing is about 3 gens too early for FPS games at 4K. Possibly 4 gens too early.


Yes, that’s the point of upscaling: it actually runs at a lower resolution but fools the player into thinking they’re running at 4K.

Why though?
You can achieve a less noticeable upscaled image with normal mathematical upscaling than DLSS can by “guessing” which pixel is what.

In the case of Nvidia’s “upguessing”, it does not fool anyone. And knowing how upscaling with waifu2x looks, this muddy/blurry look in DLSS is a limitation of the tech.

One of the images is upscaled with Lanczos, the other with waifu2x.

Example 1



Example 2


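For anyone who wants to reproduce that kind of comparison, here is a rough sketch of the Lanczos side (the waifu2x side needs its own tooling): downscale a reference image, upscale it back, and report the average per-pixel error. The file name is a placeholder, and Pillow plus NumPy are assumed.

```python
# Round-trip test: downscale a reference image, upscale it back with Lanczos,
# and measure the mean absolute per-pixel error against the original.
# "reference.png" is a placeholder; assumes Pillow and NumPy are installed.
import numpy as np
from PIL import Image

ref = Image.open("reference.png").convert("RGB")
w, h = ref.size

small    = ref.resize((w // 2, h // 2), Image.LANCZOS)   # simulated render resolution
restored = small.resize((w, h), Image.LANCZOS)           # the upscaled result

err = np.abs(np.asarray(ref, dtype=np.int16) - np.asarray(restored, dtype=np.int16))
print(f"mean absolute error: {err.mean():.2f} / 255 per channel")

restored.save("reference_lanczos_roundtrip.png")
```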

It’ll be added for the same reason checkerboard rendering exists: because hardware isn’t fast/cheap enough yet and people want to say they’re running at 4K.
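For anyone who hasn’t looked at how checkerboard rendering works, here’s a toy sketch of the core idea; real implementations use motion vectors and smarter reconstruction, while this just alternates which half of the pixels gets rendered each frame and fills the holes from the previous reconstructed frame.

```python
# Toy checkerboard reconstruction: each frame only "renders" half the pixels
# (a checkerboard pattern that flips every frame); the other half is carried
# over from the previous reconstructed frame.
import numpy as np

H, W = 4, 8
ys, xs = np.mgrid[0:H, 0:W]

def reconstruct(rendered, prev_recon, frame_index):
    mask = (ys + xs + frame_index) % 2 == 0   # pixels rendered this frame
    out = prev_recon.copy()
    out[mask] = rendered[mask]
    return out

# Two consecutive "frames" of a scrolling gradient (values 0..255).
frame0 = (xs * 32) % 256
frame1 = ((xs + 1) * 32) % 256

recon = reconstruct(frame0, np.zeros((H, W), dtype=int), 0)
recon = reconstruct(frame1, recon, 1)
print(recon)
```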