Anyone see the CPU benchmarks for GTA V by Digital Storm?

Just saw this video today:
https://youtube.com/watch?v=jNigQD5xkqw
What a difference $750 can make... a whole 10 fps. If anyone has the desire to upgrade their system, I would probably wait. I would have thought that with how expensive DDR4 and the X99 platform are, they would be putting up crazy numbers.

This doesn't mean anything..

All this tells us is that it's properly optimized for PC, unlike other console ports, which according to multiple reviewers is the case. If this were DX12, the 5960X should show something like a 40 to 50% increase over a 4770K, because it has double the number of cores and DX12 takes advantage of multiple cores. So really this is nothing. What would be more interesting is if Rockstar gave us a DX12 option.


What do you mean it doesn't "mean" anything? Facts are stubborn things. This game is one of the first to use close to, if not all of, the 4 GB (sorry, 970) that most higher-tier graphics cards have available (without mods). It's pushing over systems left and right, and you don't think it's interesting that the fps spread between the top CPUs (AMD and Intel) is only 10 fps? That's a $75-per-frame difference. You could buy a 295X2 for that...
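Just to spell out the math (a quick back-of-the-envelope sketch in Python; the $750 and 10 fps figures are the ones quoted above):

    # Rough cost per extra frame, using the figures quoted above.
    price_gap_usd = 750   # approx. price gap between the CPUs being compared
    fps_gap = 10          # approx. fps gap in the GTA V benchmark

    print(f"${price_gap_usd / fps_gap:.0f} per extra frame per second")  # $75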

Well, first of all, one source isn't enough. And two, according to these charts, the 5960X is one frame above the 4770K, which means the 5960X is a waste of money for gaming. That is why I said this means nothing.

Yeap. Like owning a Ferrari and maintaining the same 40 mph speed limit as a Honda Civic...

It's just one source, yes, but you are pulling theoretical performance-percentage increases out of the air with regard to DX12 optimizations, without citation. You were speculating as to the benefits DX12 would provide the 5960X... I thought the information originally presented was interesting as current hardware figures for real-world workloads. That's all.

That's fine, but I still stand by my point that this specific chart means nothing. What I would like to see is what the results would be if this were a DX12-based game. In theory, since DX12 takes advantage of more cores, the 5960X would slaughter all these chips.

Cores or threads? The information on the features of DX12 has been a little sketchy. If it's just cores, the 9590 would see a big increase as well, not to mention the Intel 6-core or the upcoming 12-core Zen chip. And isn't DX12 not coming out until December with Windows 10, assuming no delays?

Specifically cores. Yes, the 9590 SHOULD see an increase in DX12-based games, but we won't know until we see results. That is the reason AMD is banking on DX12 right now: it'll give all their current CPUs a boost in games, which is what they need right now, because AMD has nothing to offer in 2015 besides GPUs.

I think they are probably more excited that DX12 is supposedly going to give developers lower-level access to the hardware. A lot of games show lower performance on AMD's cards when, on paper, the AMD cards should excel. A big giveaway is Nvidia's logo splashing on the screen before a game. Lower-level access to the hardware is why games with Mantle show a huge increase for AMD cards, and I have a feeling that DX12 will level the playing field significantly.

Erm, what? A 40% increase in performance? A GPU-bound game should still perform fairly similarly on a 4770K or a 5960X, even if it can utilize 8 cores efficiently. If the GPU does most of the heavy lifting, then utilizing more cores would just lower the load on each of those cores. If your GPU is already being utilized 100%, how do you expect a 40% performance increase by throwing more CPU power at the game?
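To put that in concrete terms, here's a toy frame-time model (the millisecond numbers are invented purely for illustration, not taken from any benchmark): a frame can only finish as fast as the slower of the CPU side and the GPU side, so once the GPU is the long pole, extra cores stop adding fps.

    # Toy model: frame time is gated by whichever side (CPU or GPU) takes longer.
    # All numbers are made up just to illustrate the GPU-bound argument.
    def fps(cpu_work_ms, cores, gpu_time_ms):
        cpu_time_ms = cpu_work_ms / cores            # assume perfect CPU scaling
        return 1000 / max(cpu_time_ms, gpu_time_ms)  # bound by the slower side

    gpu_time_ms = 16.0   # GPU needs 16 ms per frame no matter what the CPU does
    cpu_work_ms = 40.0   # total CPU-side work per frame

    for cores in (2, 4, 8):
        print(cores, "cores ->", round(fps(cpu_work_ms, cores, gpu_time_ms), 1), "fps")
    # 2 cores -> 50.0 fps (CPU-bound), 4 cores -> 62.5 fps, 8 cores -> 62.5 fps (GPU-bound)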

Games are using two to four cores at most. We've already seen the increase with Mantle on the graphics side; in some games there's a performance increase of 10% to 15%. Once games in the future take advantage of ALL the cores you have, people with 8 cores will have the best experience ever, because all 8 cores will be used.

Source: http://wccftech.com/nvidia-amd-ready-generation-directx-12-api-showcase-features-benefits-d3d12-api/
Source: http://www.extremetech.com/gaming/187970-directx-12-reduces-power-consumption-by-50-boosts-fps-by-60-in-new-tech-demo
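A very loose sketch of what "using all the cores" means on the CPU side (this is not actual DX12 code, just an illustration of the pattern of splitting per-frame submission work across every core; the draw-call count and helper name are made up):

    # Illustration of the pattern only (not real DX12 code, and Python threads
    # won't truly parallelize CPU-bound work because of the GIL): split the
    # per-frame CPU work across all cores instead of piling it on one or two.
    from concurrent.futures import ThreadPoolExecutor
    import os

    def build_chunk(draw_calls):
        # Stand-in for recording one chunk of the frame's draw commands.
        return [f"cmd:{d}" for d in draw_calls]

    draw_calls = list(range(8000))
    cores = os.cpu_count() or 4
    chunks = [draw_calls[i::cores] for i in range(cores)]   # one slice per core

    with ThreadPoolExecutor(max_workers=cores) as pool:
        per_core_lists = list(pool.map(build_chunk, chunks))

    print(f"recorded {sum(len(c) for c in per_core_lists)} commands across {cores} workers")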

The Mantle games that I've tested myself (BF4 and Thief) only had that kind of improvement because they saw a significant boost in GPU usage. BF4 is a resource hog in general, and Thief overcame the extra CPU overhead from Crossfire by enabling Mantle. Both of those games had issues where the GPU couldn't be utilized 100%; that's why they saw a performance increase.
That said, GPU-bound games that already utilize your GPU 100% because they simply don't need more CPU horsepower probably still won't benefit from more cores. Take Tomb Raider (2013) as an example: if we made a DirectX 12 version of it, it would probably get a performance boost in general (due to DirectX 12 just being more efficient), but the difference between 4 and 8 cores would probably still be minimal.

I'll run a test with Thief and see if a single GPU sees a large performance increase with Mantle, brb.

Yup, Mantle doesn't offer a real performance increase in Thief if I use a single GPU. It's a GPU-bound game that has enough CPU resources.
Mantle: 44 / 77 / 53
DirectX: 42 / 74 / 51

It only makes a large difference if I enable Crossfire; the added CPU overhead bottlenecks the cards in DirectX mode, so the performance increase with Mantle is huge there.
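For what it's worth, the single-GPU numbers above only work out to a few percent in Mantle's favor (quick calc below; I'm just pairing the three figures from each run in order):

    # Relative difference between the Mantle and DirectX runs posted above,
    # pairing the three figures from each run positionally.
    mantle = [44, 77, 53]
    directx = [42, 74, 51]

    for m, d in zip(mantle, directx):
        print(f"{d} -> {m}: {100 * (m - d) / d:+.1f}%")
    # 42 -> 44: +4.8%, 74 -> 77: +4.1%, 51 -> 53: +3.9%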

Will DirectX 12 enable completely new scenes and technologies that weren't possible before? Absolutely! Will it help AMD with its lower-performance-per-clock 6- and 8-core parts? Yes, definitely. But saying that we will see a 4-core CPU outperformed by an 8-core part by over 40% is still ridiculous in a GPU-bound game. You also need to keep in mind that games will still be developed for the mainstream PC, and that PC has a quad-core at most. Even if developers could do physics and other work that would really push an 8-core CPU (which would probably make that 50% performance difference possible), they simply won't do it, because they would cripple the vast majority of users that way.

The increase in GPU usage is likely because the CPU can dole out tasks better under the more efficient API. Just saying.

Who games at this resolution? According to Steam, most people game at 1920 x 1080 (34.19%) or 1366 x 768 (26.62%) for single-monitor setups. For multi-screen it's 3840 x 1080. Who cares anyway, when in a highly optimized AAA game all you see is a 10-frame difference?

As per usual, an i5 is just about all you should bother paying for when it comes to games.

With all the delays this game had, I think they could have done a better job optimizing it. Frankly, if they used all 8 cores of the 8350, they should be able to have it run just as well as it does on the 5960X.
Edit: And yes, I realize the 5960X is way more powerful than an 8350. My point is that neither the 5960X nor the 8350 should be the limiting factor if both had proper optimization. It should be the Titan limiting both of them.

I think the moral we learned on this thread is to save the money from not buying a top tier CPU and buy more graphics cards!

Yes, of course most games run better on a 4690K or 4790K than on a 5960X, for that matter, because they have higher stock clocks out of the box.