AMD Live Stream & News

Law of averages? With reviewers being just a niche aspect of it.

I found that Overwatch vs. AI matches are perfect for Fury OC testing, because even +1 MHz too much on the core crashes the game anywhere from the lobby to the second round. Bad memory seems to turn Overwatch green... so that's easily noticeable haha.

Anyways, try that. My Overwatch OC results so far are these:
+0mV 1055/500
+6mV 1085/500
+12mV 1092/500
+18mV 1094/500
+24mV 1102/500
+30mV ????/??? reached my weekly OC limit.

The plan was to first find the 545 memory limit and then do a proper post about it, but eh, there you have it for now. Compared to hunting for the same crashes or corruption in Valley, I'm saving around 45-59 minutes this way. Not bad for a day or two after deciding that, for the first time, I'm going to OC them graphicz. 8/8, tiny bit proud.
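If you want to keep track of a ladder like that without scribbling notes, here's a rough Python sketch of the process. It doesn't touch the clocks itself (set those by hand in WattMan/Afterburner or whatever you use); the steps are just my numbers from above, so swap in your own:

```python
# Walks an OC ladder and logs pass/fail per step. Purely bookkeeping:
# you still set the clocks in your OC tool and play the vs. AI match yourself.
import csv
from datetime import datetime

# (offset_mV, core_MHz, mem_MHz) -- example ladder from my results, not a recommendation
steps = [
    (0, 1055, 500),
    (6, 1085, 500),
    (12, 1092, 500),
    (18, 1094, 500),
    (24, 1102, 500),
]

def main():
    with open("fury_oc_log.csv", "a", newline="") as f:
        log = csv.writer(f)
        for mv, core, mem in steps:
            print(f"\nSet +{mv} mV, {core}/{mem} MHz in your OC tool, then play one vs. AI match.")
            result = input("Survived lobby + a couple of rounds? [y/n/green]: ").strip().lower()
            log.writerow([datetime.now().isoformat(), mv, core, mem, result])
            if result != "y":
                print("Back off the core (or the memory if you saw green) and retest.")
                break

if __name__ == "__main__":
    main()
```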

Both AMD and NVIDIA have a habit of excluding sites from receiving sample cards based on their review scores. That's probably why you see stuff like what pclabs posted: deliberately making sure they keep getting NVIDIA cards, I would say.

For example, the guy over at Phoronix (Linux) finds it hard to get anything out of AMD because, well, in the past AMD's drivers have been amaze-balls bad under Linux compared to NVIDIA's (that is changing now, however).

and it can fit in tiny cases.

I think that's the best approach. You see, anyone sane would talk about the crazy throttling of the 1080, its temps, and the noise. But reviewers didn't, because they were not the customers. If they had bought the card with their own money they would be far more angry about it - but they didn't - so they were very happy to get a new card... and wanted to show it in the best light.

Yes, there are users who will deliberately inflate their results - but that's a small number of people. Our forum benchmarks are there to show the reality and actual performance of the hardware, rather than what silly reviewers report.

There are a few big problems with reviewers.

1) They use the same workbench to test all their GPUs.
This may not be obvious, but either driver leaves all sorts of crap behind...
2) When run on that same workbench, NVIDIA's game-ready drivers don't actually give you the settings you think you have in the game.
Configs get overridden (like the tessellation override on AMD), but most of those configs stay in place even when an AMD card is plugged in. So your "Ultra" settings might not be the same "Ultra" settings for the other card. (The Witcher 3 detects which card you have, and "Ultra" differs in what it actually sets - I disabled that on my copy.) There's a quick way to check this for yourself, sketched below.
3) Reviewers don't try to get the best out of the card; they only run a few benchmarks for 4 hours and then it's verdict time. They don't wonder about the picture quality or its problems - they simply don't talk about that... They used to talk a lot about the niches of cards, and compare the visuals between many different models and cards - which showed that many drivers cheat somewhere... and I'm sure it still happens today...

-- like AMD's default "optimization" for The Witcher 3, which gimps tessellation to 8x.
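For point 2, a quick way to check it on your own rig: save the game's settings file with "Ultra" selected on each card, then diff the keys. A rough Python sketch, assuming INI-style "key=value" files like The Witcher 3's user.settings (the file names at the bottom are just hypothetical captures):

```python
# Diff two captured game settings files and print every key that differs.
import sys

def load_settings(path):
    """Read loose [section] / key=value lines into a flat dict."""
    values, section = {}, ""
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            line = line.strip()
            if line.startswith("[") and line.endswith("]"):
                section = line[1:-1]
            elif "=" in line:
                key, _, val = line.partition("=")
                values[f"{section}.{key.strip()}"] = val.strip()
    return values

def diff(a_path, b_path):
    """Print every setting that differs between the two captures."""
    a, b = load_settings(a_path), load_settings(b_path)
    for key in sorted(set(a) | set(b)):
        if a.get(key) != b.get(key):
            print(f"{key}: {a.get(key, '<missing>')}  vs  {b.get(key, '<missing>')}")

if __name__ == "__main__":
    # e.g. python diff_settings.py ultra_amd.settings ultra_nvidia.settings
    diff(sys.argv[1], sys.argv[2])
```

If "Ultra" really were the same preset on both cards, that prints nothing.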

just go with it man ;)


Another thing to consider is that many games deliberately gimp AMD hardware by 'configuring' their tessellation levels for NVIDIA cards (32x, for example). As we all know, 16-32x tessellation is STUPID and just hurts performance with almost NO visual gain. 4-8x is the generally accepted sweet spot, which you must force-enable in the AMD drivers - something MANY review sites seem to overlook when comparing AMD to NVIDIA.
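Some napkin math on why those high factors are silly: with a uniform tessellation factor N, each patch ends up as roughly N×N triangles, so the geometry cost grows quadratically while the visual difference past ~8-16x is next to nothing. A tiny Python sketch of that scaling (a simplified model - real tessellators use per-edge and adaptive factors, but the trend is the point):

```python
# Rough model: a uniform tessellation factor N splits each patch into ~N*N
# triangles, so the work explodes quadratically as the factor goes up.
for factor in (2, 4, 8, 16, 32, 64):
    tris = factor * factor            # ~triangles per patch at this factor
    relative = tris / (8 * 8)         # cost relative to the 8x sweet spot
    print(f"{factor:>3}x -> ~{tris:>5} tris/patch (~{relative:.1f}x the work of 8x)")
```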

JayzTwoCents went one better and did an entire video on it:

At least he has the card to talk about the thermals and such. I see you as being no better, talking about how it has thermal issues and a noise problem when you don't even have a GTX 1080 to test.


10PM start here
https://www.youtube.com/watch?v=Ljdg1J0XBSs


I think 16-32x is fine; the problem is The Witcher 3 can run with 64-128x tessellation by default. (I'm sure NVIDIA doesn't run with that tessellation either - judging by the visuals, I'd say they're running at around 48-52x.)

I force The Witcher 3 to run at 32x on my rig.

Compare it to 8x, or 4x. I don't think you will notice much difference.

I see the difference :) that's why I'm sitting at 32x. It's not as big a performance hit as 64x or 128x, though.

Do you have to stop and zoom in on Geralt's hair to notice it? LOL

Sometimes.

Again, it's not that big a performance hit on a Fury or 290X going from 16x to 32x.

I did try forcing tessellation down at The Witcher 3's release, but turned it off because the water became muddy - or however I should describe blue gel. Didn't bother testing more back then, but maybe I will this time because of mods.

Try 32x; it simply doesn't get better than 32x.

On all of my screenshots I run 32x.
I'll post some new screenshots (just taken at 1440p) in the lounge now, showing off the hair and water at 32x.

Seems alright to me.
Are you forcing AA from the driver / a mod, or is it just the default blur softening that UI?

My older shot for comparison purposes.
http://images.akamai.steamusercontent.com/ugc/573440425649959570/79CAD6AE0D5BA02D8346D3537EEC271543A4D244/

Youtube streaming:

Twitch:


I am forcing adaptive multi-sampling and morphological filtering (which gives off a kind of blurry feeling, while it actually isn't), plus the highest possible settings (and then some more detail through the configs), then on top of that the Turbo Lighting mod injector, and a few other mods to sharpen up the textures.

Someone post the things that are happening as they happen, so those who can't watch live can follow along after the fact.