Maybe the 8350 has some life yet - 980 SLI vs 4930k

I meant to post this a while back and forgot about it. Apologies if this has been posted before, but I find it very interesting. At higher resolutions, the gains from a more expensive CPU drop off greatly (duh), and at 4K the 8350 even pulls ahead in a lot of tests. Maybe the coming generation of 4K and DX12/Vulkan will keep the 8350 alive and kicking longer than we anticipated.

1440p

4k

EDIT: Updated articles with 4770k added.


Maybe, though the article reads like the author was out to prove a preconceived point. They really should have thrown in a couple of CPU-bound games, as well as a few more configurations.


I have always been interested in seeing more CPU benchmarks. I would love for it to become the norm to have tons of CPU benchmarks on every review site, but that simply isn't the case. It seems this author was sick and tired of the way things were and realized that people were pointing to the wrong things and wasting money. He seems to have proved his point, but the bias is a little unnerving and annoying; I hope it didn't affect the benchmarking any. I would love to see more benchmarks, personally. Maybe Logan et al. can set some up. Maybe Pistol's rig now vs. her old rig (X99 (whatever CPU she has) vs. the 9590).

Well, he did prove the point that Intel 6-cores are a waste for gaming and that spending a fortune on a CPU and motherboard isn't smart... but most anyone on this site could have told you that. My concern is that by presenting such a limited picture, the author risks leading the uninformed reader to the opposite extreme. An 8350 dual GTX 980 build is about as senseless as a pure gaming build running an Intel 6-core.

But yeah, it is always cool to see more CPU benchmarks.

It seems common to see people saying that an 8350 will bottleneck a 980 or SLI 970s or whatever. At 1080p, I suppose that's true. Crank up the resolution, and not so much.

I agree that's true in most instances, and if you have an 8350 (like I do) it's probably worth sitting on for a while longer. But still, take those 1440p results: they found the Intel build 10-20% faster, but what they didn't say was that the same 10-20% advantage could be had from a locked Haswell i5 on a cheap board for a similar total cost, or that 20% is pretty much the advantage a GTX 980 has over a good 290X despite it being something like 250 dollars more expensive.
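To put rough numbers on the value argument, here's a quick cost-per-frame sketch in Python. Every price and FPS figure below is an assumption I made up for illustration (ballpark street prices of the time), not a number from the article:

```python
# Rough cost-per-frame comparison. All prices and FPS figures are
# illustrative assumptions, not numbers taken from the linked article.
builds = {
    "FX-8350 + cheap board":   {"platform_cost": 170 + 90,  "fps": 100},
    "Locked i5 + cheap board": {"platform_cost": 190 + 80,  "fps": 115},  # ~15% faster
    "4930K + X79 board":       {"platform_cost": 580 + 250, "fps": 118},  # barely faster at 1440p
}

for name, b in builds.items():
    print(f"{name}: ${b['platform_cost']} -> {b['fps']} FPS "
          f"(${b['platform_cost'] / b['fps']:.2f} per frame)")
```

Under those assumptions, the 4930K platform costs roughly three times as much per frame as either budget build at 1440p, which is the whole point.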

I suppose I just wish the author had thrown in a smart AMD build and a smart Intel build, as well as a few more games, to give a more complete picture for the uninformed reader the article is trying to help.

Still cool to see the benches though. So yeah, thanks for posting them as I hadn't seen these articles before.


It is common to see people say that X processor will bottleneck Y GPU. I find this strange for two reasons. First, it's never the GPU that they say is doing the bottlenecking, which is odd: in games A, B, and C that are GPU-bound, the CPU isn't being stressed enough to cause a bottleneck, and in games X, Y, and Z that are CPU-bound... wouldn't things be CPU-optimized, since they're CPU-bound? Kind of like GPU optimization? Maybe? Just a thought. Second, a lot of the time the claimed CPU bottleneck isn't actually happening (as I just described).

Sadly, I see few CPU-specific benchmarks as far as games go, so I can't reference one here. It's ironic, because with all the Intel vs. AMD and budget CPU vs. top-tier CPU arguing that goes on... you'd think people would benchmark CPUs in games as much as they do GPUs, but you can find something like 15 GPU benchmark videos on YouTube for every 1 CPU video. The only personal experience I can lend you is Guild Wars 2, where my GTX 760 seems to be bottlenecking my overclocked 6300. I could be wrong that 100% GPU usage = bottleneck in this situation, but the reason I think so is that GW2 is a CPU-heavy game...
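If you want to sanity-check which side is the limiter yourself, a minimal sketch like the following works, assuming an NVIDIA card with nvidia-smi on the PATH and the psutil package installed (both are my assumptions, not anything from this thread):

```python
# Quick-and-dirty bottleneck check: sample CPU and GPU utilization
# while a game runs. Assumes an NVIDIA card with nvidia-smi on the
# PATH and the psutil package installed (pip install psutil).
import subprocess
import time

import psutil

def gpu_utilization() -> int:
    """Ask nvidia-smi for the current GPU utilization percentage."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

for _ in range(30):  # sample for ~30 seconds while the game runs
    cpu_per_core = psutil.cpu_percent(percpu=True)
    print(f"GPU {gpu_utilization():3d}% | "
          f"busiest core {max(cpu_per_core):5.1f}% | "
          f"all cores {cpu_per_core}")
    time.sleep(1.0)
```

If the GPU sits near 100% while even the busiest core has headroom, you're GPU-limited at those settings; if one core is pegged while the GPU idles, the game is CPU-limited, often on a single thread, which is why per-core numbers matter more than the overall average.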


The whole "CPU will bottleneck GPU" argument has always bugged me. First off, a bottleneck implies unsatisfactory performance, so the word is entirely overused. It's also highly dependent on the games, other applications, and how each piece of software utilizes the CPU. Secondly, strictly speaking, there is ALWAYS going to be a bottleneck. Will the FX-8350 bottleneck SLI 770s? Probably, especially if you play at 1080p. Does the FX prevent you from getting 60 FPS? No, so why does that "bottleneck" matter? And if you crank the settings and resolution high enough, you'll just shift the bottleneck back to the GPU(s) handily, and at that point it doesn't matter if you have a completely stock FX-8350 or a Core i7 5960X OC'd on LN2 running at 6.5GHz.
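One way to see why the bottleneck shifts with resolution is to model frame time as the slower of the CPU's and GPU's per-frame work. This is a deliberately crude model with invented millisecond costs, just to show the shape of the effect:

```python
# Toy model: each frame costs some CPU time and some GPU time; the frame
# rate is limited by whichever is slower. All millisecond figures below
# are invented for illustration, not measurements.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = {"FX-8350": 10.0, "i7-4930K": 7.0}         # per-frame CPU cost (assumed)
GPU_MS = {"1080p": 8.0, "1440p": 14.0, "4K": 30.0}  # per-frame GPU cost (assumed)

for res, gpu_ms in GPU_MS.items():
    row = ", ".join(f"{cpu}: {fps(cpu_ms, gpu_ms):5.1f} FPS"
                    for cpu, cpu_ms in CPU_MS.items())
    print(f"{res:>5}: {row}")
```

With these made-up numbers the faster CPU wins by ~25% at 1080p, but as soon as the GPU's per-frame cost exceeds the CPU's, both chips converge on exactly the same frame rate.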

And that's a really good point as well: unless the FX is at or near 100% utilization, it isn't the CPU itself that's the bottleneck, it's the game. When a dual-core i3 outperforms an FX-8370, I get the sneaking suspicion that a healthy chunk of that FX chip is sitting idle, twiddling its figurative thumbs, while the i3 is sweating bullets. And if you were to do something, I don't know, say, capture said gameplay via software screen capture and an encoder, I wonder how the results would change.

Raw performance

"CPU" bottleneck - playing on a 60hz monitor, you would not see any difference between any of the CPUs

GPU bottleneck (results are within margin of error; the difference between the <$100 CPU and the >$1000 CPU is the weight of your wallet afterwards)


To add onto what @jerm1027 said: http://www.techspot.com/articles-info/645/bench/CPU_02.png

Here we see that it took downclocking an FX-8350 to 2.5GHz before it started to measurably affect FPS in Tomb Raider.

The blame for bottlenecks does lie with the software, not the hardware. (Just because it is not the best product doesn't mean that it is a bad product!)

Anyway, even if the reviews/tests linked by the OP were cherry-picked to prove a preconceived point, I'm still glad they exist - they show what even halfway-decent coding and optimization can allow for. This is why Mantle was created - and thank god, too, because now we're getting DirectX 12.

I agree; these games are well optimized to work on a large variety of CPUs and are more reliant on the GPU. Unfortunately, there are benchmark "reviews" that tend to be one-sided (in favor of either side): more CPU-reliant ("Intel dah best, omg buy Intel 4 lyfe") or more GPU-reliant ("no, AMD is g00d 2 you money wasterz!"). Sorry for the crude stereotyping. In conclusion: if you play games that don't need much in terms of a CPU, AMD is great for the money, but if you're playing games that see a big benefit from going with an i5 or i7 (or, rarely and weirdly, a Pentium), then an i5 would be good. I have seen some games that will benefit (a 10-50 FPS boost) from upgrading from an FX to an i5, and, like here, some that obviously don't (or by so little as to not justify the extra cash). Research to see what'll be best... I would also point out that the i7-4790K is between one and two years newer than the FX-8350 (just look at the difference between flagship GPUs over that amount of time).