GPU Optimizations

A small thread to discuss GPU optimization across NVIDIA, AMD, and even Intel. Not a who's-better thread - it's obviously Intel who wins. How could we optimize rendering, storage, and overall performance of GPUs?

For a start I would like to recap the nice optimizations that Maxwell brought. The amount of technology packed into those cards was enormous. NVIDIA created optimizations handled natively in hardware that were real game changers, even though the cards lacked the raw horsepower of AMD.

The biggest optimization that Maxwell natively supports is tile-based rasterization.
This optimization gives a lot of headroom against unoptimized game API code and against AMD cards.
You get massive gains in performance and memory usage, but anyone with keen eyes would have noticed that there is a trick involved. Still, it's an optimization worth having, and the losses are too small to care about.
(This is also one of the reasons why Maxwell GPUs handled tessellation so much better than Kepler cards, even though architecturally they were otherwise fairly similar.)
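
To make the idea concrete, here's a rough toy sketch (purely illustrative - NVIDIA's actual binning hardware is undocumented) of what "tile-based" means: sort triangles into screen tiles first, then shade one tile at a time so its piece of the framebuffer stays in on-chip cache.

```python
# Toy sketch of tile-based rasterization: bin triangles into screen-space tiles,
# then process one tile at a time so that tile's framebuffer region can stay
# in on-chip cache instead of bouncing through VRAM. Not NVIDIA's real scheme.

TILE = 16  # hypothetical tile size in pixels

def bounding_tiles(tri, width, height):
    """Return the ranges of tiles a triangle's bounding box touches."""
    xs = [v[0] for v in tri]
    ys = [v[1] for v in tri]
    x0, x1 = max(0, min(xs)) // TILE, min(width - 1, max(xs)) // TILE
    y0, y1 = max(0, min(ys)) // TILE, min(height - 1, max(ys)) // TILE
    return range(x0, x1 + 1), range(y0, y1 + 1)

def bin_triangles(triangles, width, height):
    """Pass 1: build a per-tile list of the triangles that may cover it."""
    bins = {}
    for tri in triangles:
        tx_range, ty_range = bounding_tiles(tri, width, height)
        for ty in ty_range:
            for tx in tx_range:
                bins.setdefault((tx, ty), []).append(tri)
    return bins

def rasterize(triangles, width, height):
    """Pass 2: walk tiles, not triangles - each tile's pixels stay 'hot'."""
    bins = bin_triangles(triangles, width, height)
    for (tx, ty), tris in sorted(bins.items()):
        # In hardware this is where the tile's color/depth would sit in cache,
        # so overdraw within the tile never touches external memory.
        print(f"tile ({tx},{ty}): {len(tris)} triangle(s) to shade")

# Tiny usage example: two triangles on a 64x64 render target.
rasterize([[(2, 2), (30, 4), (10, 28)], [(40, 40), (60, 42), (50, 60)]], 64, 64)
```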

You can read about Maxwell's optimizations here (it's about tile-based rasterization):
http://www.extremetech.com/gaming/232771-targeted-testing-reveals-secrets-of-nvidia-maxwell-pascal-power-efficiency

The latest optimization I noticed was while testing my friend's 1060 in my server rig. I had games running on the same machine and the same monitor, compared it side by side with my Fury X and a 780, and noticed something very interesting.
I think it's the newest optimization from NVIDIA, or maybe it's something that was done and quietly hidden away a long time ago.

The colors on the 1060 in games seemed washed out - not literally, but dark areas were shades of gray "black" rather than simple black/dark, and the same applied to other colors. (Note that I tested this by rendering at 4K and up-scaling to 8K to really notice the difference in color palette.) The Pascal card was rendering noticeably fewer colors. (Thanks to my reading on SDR and HDR I had become quite familiar with the idea.) I thought it was something wrong with my setup, but I couldn't find anything at all on my side that could have caused it. My view was confirmed when I watched an RX 480 vs 1060 comparison in BF1 under DX12.

And that's not all of it: if we divide the screen even further, we can notice that there are even fewer colors towards the edges of the screen.
The question remains: how much performance would that actually give? If you cut off colors and merge them into the same value - making the dynamic range smaller than it should be - you are effectively rendering fewer bits.
(I don't have a Maxwell card to compare, but I think it may have the same, or at least similar, color-palette behavior as Pascal.)
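
To put rough numbers on the "effectively fewer bits" idea, here's some back-of-the-envelope quantization math. It's not a measurement of anything Pascal actually does, just what squeezing 8-bit full range into limited range costs:

```python
# If output gets squeezed from full range (0-255) into limited/video range
# (16-235), you only have 220 distinct levels per channel instead of 256.
import math

full_levels = 256              # 8-bit full range RGB
limited_levels = 235 - 16 + 1  # 8-bit limited ("video") range

print(f"full range levels per channel:    {full_levels}  (~{math.log2(full_levels):.2f} bits)")
print(f"limited range levels per channel: {limited_levels}  (~{math.log2(limited_levels):.2f} bits)")

# Total representable colors across 3 channels:
print(f"full range colors:    {full_levels ** 3:,}")
print(f"limited range colors: {limited_levels ** 3:,}")
# Everything darker than code 16 collapses to the same "black", which is the
# grayish black described above when the display expects full range.
```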

So yeah, I would love to hear your opinions and ideas on improving performance, and tricks in hardware/programming - what they should do, and how much we would gain from it.

"Lacked the Horsepower for AMD"

GTX 780 Ti vs R9 290X: the 780 Ti performed better.
GTX 980 Ti vs R9 Fury X: the 980 Ti performed better (although HBM is sick stuff, I want some).

I am just glad AMD has improved a lot on their drivers since their Fiji cards launched.

By horsepower I meant TFLOPs.

Oh, yeah, technically, the R9 390 was close to the GTX 980 Ti in TFLOPs.

But since that's the case, this puts AMD in last place for GPU optimizations, NVIDIA second, and Intel first - although Intel's iGPUs perform like crap compared to dGPUs, and even setting Iris Pro aside they're crap compared to R7 graphics.
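
For reference on the TFLOPs point above, theoretical FP32 throughput is just shader count × 2 FLOPs (FMA) × clock. A quick sketch using the commonly listed reference clocks (real boost behavior obviously varies):

```python
# Theoretical FP32 throughput from shader count and clock speed.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"R9 390     (2560 SP   @ 1000 MHz): {tflops(2560, 1000):.2f} TFLOPs")
print(f"GTX 980 Ti (2816 CUDA @ 1075 MHz): {tflops(2816, 1075):.2f} TFLOPs")
print(f"R9 Fury X  (4096 SP   @ 1050 MHz): {tflops(4096, 1050):.2f} TFLOPs")
```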

I don't disagree, but it shows how much untapped potential there would be if AMD switched to a tile-based rasterization engine instead of their current one. Plus, memory compression was non-existent on the Hawaii chips (290/390), so in reality they couldn't get the full bandwidth out - the RX 480 gets more out of its bus with the new GCN revision, since they added memory compression.
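
Rough illustration of the bandwidth point: the raw numbers below are the usual spec-sheet ones, but the DCC gain factor is a made-up round number just to show the shape of the argument, not a measured value.

```python
# Raw bus bandwidth vs "effective" bandwidth once color compression is in the mix.
def raw_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

hawaii_raw  = raw_bandwidth_gbs(512, 5)   # R9 290X: 512-bit bus, 5 Gbps GDDR5
polaris_raw = raw_bandwidth_gbs(256, 8)   # RX 480:  256-bit bus, 8 Gbps GDDR5

print(f"Hawaii raw:  {hawaii_raw:.0f} GB/s, effective ~{hawaii_raw:.0f} GB/s (no DCC)")
print(f"Polaris raw: {polaris_raw:.0f} GB/s, effective ~{polaris_raw * 1.3:.0f} GB/s (assumed ~1.3x from DCC)")
```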

I thought the R9 290X was pretty much on par with the 780 Ti?

At release they were not; NVIDIA had a huge advantage. Today an overclocked 290X beats the 780 Ti and even trades blows with overclocked 980s.

AMD Driver maturity ftw.

The R9 290 stomped the GTX 780, cornering NVIDIA into quickly lowering its price and even releasing the 780 Ti to beat it and the 290X.

But then gimped drivers happened, AMD's drivers improved, and now a GTX 780 performs more like a 280X than a 290 in new titles.

What connection was the monitor using with the GTX 1060? Odds are the YCbCr color space was enabled instead of the full RGB color space.
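
For anyone wondering what that would actually do to the picture, here's a minimal BT.709 RGB-to-YCbCr conversion with limited-range quantization. If the GPU outputs limited range but the monitor expects full range, black lands at code 16 and shows as dark gray, which matches the description above:

```python
# BT.709 full-range RGB -> limited-range YCbCr, just to show where black ends up.
def rgb_to_ycbcr_bt709_limited(r, g, b):
    """r, g, b are 8-bit full-range values (0-255)."""
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    y  = 0.2126 * rn + 0.7152 * gn + 0.0722 * bn
    cb = (bn - y) / 1.8556
    cr = (rn - y) / 1.5748
    return (round(16 + 219 * y),    # luma quantized to 16..235
            round(128 + 224 * cb),  # chroma quantized to 16..240
            round(128 + 224 * cr))

print("full-range RGB black (0,0,0)       ->", rgb_to_ycbcr_bt709_limited(0, 0, 0))
print("full-range RGB white (255,255,255) ->", rgb_to_ycbcr_bt709_limited(255, 255, 255))
```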

That said, Pascal does have "fourth-generation Delta Color Compression", meaning a 700-series card has second-generation Delta Color Compression.
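
The real DCC schemes are proprietary, but a toy sketch of the principle looks something like this: store one anchor value per block plus small deltas, so smooth regions compress well and noisy ones fall back to raw.

```python
# Toy delta color compression: one anchor per block, small deltas for the rest.
# Helps bandwidth (less data moved per block), not memory capacity.
def compress_block(pixels, delta_bits=4):
    """pixels: list of 8-bit channel values for one block (e.g. 8 reds)."""
    anchor = pixels[0]
    deltas = [p - anchor for p in pixels[1:]]
    limit = 2 ** (delta_bits - 1)
    if all(-limit <= d < limit for d in deltas):
        # Compressible: 8 bits for the anchor + delta_bits per remaining pixel.
        return ("compressed", 8 + delta_bits * len(deltas))
    # Fall back to storing the block uncompressed.
    return ("raw", 8 * len(pixels))

smooth_block = [120, 121, 121, 122, 120, 119, 121, 120]  # sky-like gradient
noisy_block  = [12, 240, 37, 199, 5, 250, 90, 160]        # high-contrast detail

print(compress_block(smooth_block))  # ('compressed', 36) vs 64 bits raw
print(compress_block(noisy_block))   # ('raw', 64)
```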

The most interesting place to look for GPU optimization is video, since that's the most common task a GPU will be used for as resolutions get bigger. HEVC Main10 is decoded by all three companies' chipsets, but VP9 support has been experimental/not working on Pascal and Polaris so far. It will be interesting to see if Kaby Lake succeeds where the other two have failed. Then again, NVIDIA could fix this in Video Codec SDK 7.1, along with adding support for decoding HEVC Main12.
http://us.download.nvidia.com/XFree86/Linux-x86_64/370.28/README/vdpausupport.html#vdpau-implementation-limits
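
If you want to check what your own driver exposes on Linux, a quick script like this can dump the relevant decoder lines from vdpauinfo (assuming it is installed; the exact profile names vary between driver and VDPAU versions), which is the same data as the limits page linked above:

```python
# List the HEVC/VP9/H264 decoder profiles reported by vdpauinfo, if present.
import shutil
import subprocess

def vdpau_decoders():
    if shutil.which("vdpauinfo") is None:
        raise RuntimeError("vdpauinfo not found - install it from your distro repos")
    out = subprocess.run(["vdpauinfo"], capture_output=True, text=True, check=True).stdout
    return [line.strip() for line in out.splitlines()
            if any(codec in line for codec in ("HEVC", "VP9", "H264"))]

for line in vdpau_decoders():
    print(line)
```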

Yes, I'm aware of those settings. Everything was at defaults and it was running as RGB with full dynamic range. I saw the same thing over DVI and HDMI (cables of decent quality) - I couldn't test through DisplayPort since it's used by my Fury X on my monitors, and I also use HDMI on the Fury X. I tested it on two monitors with exactly the same settings and got exactly the same results. I also see the same thing online while people are streaming or in YouTube recordings of 1060s. I mean, this doesn't impact the player in a bad way; some may even prefer it that way. It's just something I've noticed, and I think it's clever that NVIDIA makes such moves/tricks, because only a few people would actually spot them. A normal gamer or even a professional might not see them, as they are pretty much invisible to the naked eye.

I found an article about Pascal having issues with the signal it sends over HDMI.
It describes the exact issue I saw - it looks like running it in full dynamic range doesn't fix the issue at all.
https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/