This really makes me want to buy AMD. Only publicity and voting with your wallet can change this crap.
http://wccftech.com/nvidia-hairworks-sabotage-amd-witcher-3-performance/
I am glad this is getting some attention; it's a shame it's coming from users and no one in the media is picking it up. It goes much deeper than just AMD performance.
Apparently there is a workaround so AMD's performance won't tank with HairWorks - override the tessellation settings via CCC (Catalyst Control Center).
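If I remember right, the setting lives under Gaming > 3D Application Settings in CCC: switch Tessellation Mode to "Override application settings" and cap Maximum Tessellation Level at something like 8x or 16x. I haven't tested it myself, so take the exact menu names and values as approximate, but that's the commonly cited fix.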
Anyway, in hindsight I really regret buying my 760. I bought it when cryptocurrency mining had heavily inflated AMD card prices, making the 760 the only option available with the performance I wanted at a price I could afford. I was thinking of waiting it out, but the G-Sync announcement sent me over the edge. Looking back, G-Sync is a bust for me: it only supports one input (I have two consoles now, plus a laptop, and ended up getting a FreeSync monitor) and is way too expensive. With all the NV antics lately, this may very well be the last NV product I buy for a very long time, at least until NV gets a change in management.
NVIDIA tried to pull this stunt a year ago with their GameWorks software in games like Watch Dogs and one of the Crysis games. There was a piece of code in a .dll file that made performance go straight to hell if you used an AMD card with GameWorks turned on, and because it was a .dll, the developer couldn't go in and change the code or add support for AMD cards. They've been called out on this before, so the fact that they're trying to pull the same stunt a year later is interesting to me.
It's a bit of a fix for the HairWorks problem. I don't like the way HairWorks looks anyway, so I keep it off.
The crazy thing is that HairWorks is horrible for NVIDIA cards too; it's just not as pronounced as on AMD. An Ars Technica article describes this:
"Over at German site Hardwareluxx, they found that turning on HairWorks dropped the frame rate on a GTX 980 from 87.4 FPS down to 62.2 FPS, a performance hit of around 30
percent. The situation was far worse for AMD's R9 290X, which took a gargantuan hit from 75.8 FPS to 29.4 FPS."
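Running those numbers myself (my math, not Ars'): the GTX 980 loses (87.4 - 62.2) / 87.4 ≈ 29% of its frame rate, while the R9 290X loses (75.8 - 29.4) / 75.8 ≈ 61%. So the AMD card takes roughly double the relative hit, on top of starting from a lower baseline.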
It seems like this isn't only an AMD/NVIDIA shenanigan, but a PC/console trick too. All the console fanboys can say, "Hey, these graphics are as good as the PC!" when the PC graphics have just been severely crippled.
I can't seem to understand why devs buy into the whole GameWorks thing. The effects aren't even that great, to be honest. HairWorks could actually use some work, and HBAO+ isn't anything special. And why use MSAA? I can just render the game at 4K and downsample to 1080p; it gets rid of all the jagged lines, and it runs better. Kind of fucked up that running the game at a really high resolution uses fewer resources than simply getting rid of jagged lines.
Another thing that's kinda f'd up? Losing 30-60% of your performance for some hair... insane.
That's shady business practices for ya.