8th Gen console CPU alternatives

Ok, I have to ask, for people who know more about CPUs than I do, about the 8th gen consoles. I get that the CPU choice was limited by the desire to have a single monolithic SoC, which mostly limited the choice to AMD, but surely Jaguar was a step down from Bulldozer or its derivatives? I get that a Bulldozer "8 core" would have pushed up power consumption, but is that really such an issue for a home console? It feels like we've had a generation of gaming limited by consoles running on AMD's equivalent of the Atom, when even as bad as Bulldozer was, it was a superior alternative that was widely available back in 2013.

1 Like

Wrong.
Jaguar has better power efficiency and, clock for clock, is about 30% faster than Bulldozer.
Remember when AMD said they expected Zen 1 to have a 40% improvement over the current generation at the time? They were talking about Jaguar. But since Jaguar was never released as a mainstream CPU, they just compared it to Bulldozer on release and stated they had outperformed expectations.
Also, Jaguar was built with HSA and hUMA in mind, which are a massive help for an APU - basically they let the CPU and GPU access the same data in RAM at the same time, so you don't have to allocate separate memory for the GPU and you don't have to copy data from the CPU's portion of RAM to the GPU's portion.
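To make that concrete, here's a tiny illustrative sketch of the difference. The gpu-style helpers are hypothetical stand-ins (a second malloc simulating a separate VRAM pool), not a real AMD or console API - the point is only that a discrete setup needs a second allocation plus a copy, while a hUMA-style APU can hand the GPU the same pointer:

```c
/* Purely illustrative sketch of why hUMA helps an APU.
 * The "upload" helpers are hypothetical stand-ins, not a real API. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define N 4096

/* Discrete-GPU model: the CPU fills a buffer, then the data must be
 * copied into a separate "VRAM" allocation before the GPU can use it. */
static float *discrete_upload(const float *cpu_data) {
    float *vram = malloc(N * sizeof *vram);    /* separate GPU pool  */
    memcpy(vram, cpu_data, N * sizeof *vram);  /* extra bus transfer */
    return vram;
}

/* hUMA-style APU model: CPU and GPU address the same coherent memory,
 * so the "upload" is just handing over the pointer -- no copy. */
static const float *unified_upload(const float *cpu_data) {
    return cpu_data;
}

int main(void) {
    float *scene = malloc(N * sizeof *scene);
    for (int i = 0; i < N; i++) scene[i] = (float)i;

    float *copied = discrete_upload(scene);       /* 2x memory, 1 copy   */
    const float *shared = unified_upload(scene);  /* 1x memory, 0 copies */

    printf("discrete copy at %p, unified view at %p (same as source %p)\n",
           (void *)copied, (void *)shared, (void *)scene);

    free(copied);
    free(scene);
    return 0;
}
```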
Also, there were rumors that Sony and Microsoft went to Nvidia first, and Nvidia did what Nvidia does best - they alienated both companies and quoted such high prices that both consoles just went with AMD.

But to answer your question directly - Jaguar has massive performance, feature, and efficiency improvements over the FX-era Bulldozer cores.

1 Like

Well, sure, Jaguar is more efficient and has better performance per clock than the Bulldozer derivatives, but Bulldozer clocks way higher, hence the better overall performance. If the consoles were battery powered it would matter, but I don't think there would have been any problem with just accepting the higher power consumption of something like the FX-8320 running at double the clock speed (Jaguar's 1.6GHz versus probably about 3GHz).

1 Like

This. It was only ever realized in the elusive A13 laptops that came out just before, or right in the middle of, the Zen 1 release, and I think only in Asia and Germany.

I honestly love Poor-Dozer and I am still rocking the 6300FX right now until I can build a new PC and not pay the price of a small car.

1 Like

That is true on one side.
On the other side, you need to cool that thing in a small form factor case, with tiny fans, without insane noise. And remember, AMD had bad heat and efficiency at the time. You had what was essentially a 270/270X-class GPU in there. Those were 150W GPUs, and then you add the extra power for the CPU on top.
This thing needs cooling.
Remember what happened with the PS4 Pro, which added a more powerful GPU and overclocked the CPU? People complained en masse about how noisy it was.

1 Like

Yes, because it was always designed to be a mobile platform, until AMD had to abandon CMT for SMT because developers were not yet designing properly multi-threaded software.

Also, if you do some research, you will see that Jaguar and the last of the heavy machinery CPUs actually suffer in performance at high clock speeds and consume disproportionately more power for little gain.

Very untrue. Clock speeds have nothing to do with IPC, and the flip side of that is that the other components have to be able to communicate effectively with the CPU. You can't just overclock a CPU and do nothing to the other components. This is where power consumption becomes a thing. GDDR consumes much more power than DDR, and that adds to the overall heat envelope of the system. Look at the form factor of the PS4 and look at your PC case. Look at the PS5 and then look at your computer.

See above. Also, smarter people than us work on these things. There is a cost-to-performance-to-effort trade-off. In that triad, you can only really pick two.

2 Likes

I mean, maybe it's hindsight talking, but between the Xbox Series X and the PS5… these are monster consoles with huge cooling budgets. You're still talking about 300W tops for both CPU and GPU, which is a lot for a console, but manageable. There are standalone GPUs that manage that amount of heat output.

You seem confused; I don't really care about IPC. The only thing I care about is gaming performance. Jaguar has higher IPC than the 8320, but it is clocked at 1.6GHz while the 8320 clocks at 3.5-4GHz. Jaguar's IPC is not so much better that it makes up for having literally less than half the clock speed. This is why the Bulldozer* CPU performs better.

*I know I’m using Bulldozer, Piledriver, Excavator, etc all interchangeably. I don’t really care enough to be precise here.

Edit: I've just grabbed a tape measure and looked at the dimensions of a PS5: it's not actually all that far off my PC case. I also have a 16-core CPU and a 3090, which is a bit more than I'd expect of even a Bulldozer-based console.

I call that collection of architectures "heavy machinery". The term may work for you too.

Honestly, I think you are confused because you are treating the two as mutually exclusive, and they are not. Full stop. When you put too much voltage through a system, the gates leak, which causes errors, which means you have to account for them by spending more cycles doing CRC and correcting any errors encountered. How do you ramp up clock speeds on a CPU? The simplest way is to drive it harder by upping the voltage. Higher clocks make Jaguar's efficiency suffer, so you find the sweet spot where the different curves meet. → Differential equations.
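To put rough numbers on the voltage/clock point, here is a back-of-envelope sketch only, using the textbook dynamic-power approximation P ≈ C·V²·f and made-up illustrative voltages, not actual Jaguar or FX specs:

```c
/* Back-of-envelope sketch of the dynamic-power argument above, assuming
 * P_dynamic ≈ C * V^2 * f. Voltage figures are illustrative only. */
#include <stdio.h>

int main(void) {
    /* Baseline: 1.6 GHz at a nominal 1.0 V (made-up example values). */
    double f_base = 1.6, v_base = 1.00;
    /* Pushed: 3.2 GHz, which in practice also demands more voltage --
     * say 1.25 V for the sake of the example. */
    double f_oc = 3.2, v_oc = 1.25;

    /* Relative dynamic power: (V_oc / V_base)^2 * (f_oc / f_base). */
    double rel_power = (v_oc / v_base) * (v_oc / v_base) * (f_oc / f_base);

    printf("2x clock at +25%% voltage => roughly %.1fx the dynamic power\n",
           rel_power); /* ~3.1x: power grows much faster than the clock */
    return 0;
}
```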

And yes, the low clock speed does not mean that the Jaguar CPU has double the IPC of the other heavy machinery CPUs, but it also has a GPU that it has to feed and give up die space to. Some of that power budget has to go to the GPU. Just because they are on the same die does not mean they share resources freely. An APU is much more complex than a straight CPU.

To help you better understand, you may be able to find some videos that explain how one can affect the other. If you really want to get your hands dirty, I would recommend taking a computer architecture and design course. It is bog standard for any CS degree and most electrical engineering degrees.

Those are kinda connected…

That is not the most important thing, especially at a time when programming was starting to push towards heavy multithreading.
Yes, you might have been able to get, what, 10% better frame rates for doubling the power usage. It is not worth it. Also, they focused on the GPU.
At the time of release, the 7850/7870 were among the higher end of graphics cards.

The thing is, the consoles back in the day were made on 28nm. The new consoles are made on 7nm. The feature size has literally been quartered, so you can fit so much more in the same space.

Look, I get your point. But it’s not as simple as this.
Especially with that GPU. At the time, even an 8320 was more than enough for a 7870; an 8320 was perfectly fine even for a 7970. What I'm saying is that a downgrade in the CPU won't cripple the graphics, and when you think about 30% better IPC at 50% lower clocks, we end up with, what, something like 20% lower performance for a quarter of the power use and basically no effect on GPU performance (quick math in the sketch below).
It's like pairing an i9 CPU with an RX 480… you will get the same result with an i3…
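For what it's worth, here is the quick multiplication using the round figures floated in this thread (~30% higher IPC for Jaguar, 1.6GHz vs 3.5GHz); it lands a bit lower than the 20% above, but these are forum estimates, not benchmarks:

```c
/* Rough single-thread comparison using the round numbers thrown around in
 * this thread (Jaguar: ~30% higher IPC at 1.6 GHz; FX-8320 class: 3.5 GHz).
 * These are forum estimates, not measured results. */
#include <stdio.h>

int main(void) {
    double jaguar_ipc = 1.3, jaguar_clk = 1.6;  /* relative IPC, GHz */
    double fx_ipc     = 1.0, fx_clk     = 3.5;  /* baseline IPC, GHz */

    /* Very crude model: single-thread throughput ~ IPC * clock. */
    double jaguar_perf = jaguar_ipc * jaguar_clk;
    double fx_perf     = fx_ipc * fx_clk;

    printf("Jaguar / FX-8320 per-core ratio: %.2f\n", jaguar_perf / fx_perf);
    /* ~0.59 with these inputs -- the per-core gap the thread is arguing
     * about, before any power, memory, or threading considerations. */
    return 0;
}
```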

3.5GHz isn't an overclock; it's the base frequency of that CPU. No one is "ramping up" anything - Bulldozer is just designed to run at a higher frequency than Jaguar.

Not to the extent you're talking about, where a 1.6GHz Atom-equivalent is able to outclass a 3.5-4GHz desktop-class CPU. That just isn't what's observed.

This was the generation that gave us 4K 30fps because the graphics hardware was classes ahead of the CPU.

And how much bigger is it compared to the PS4? My point was that Sony went more of a brute-force route with the PS5, hence all the fat-shaming people threw at it when Sony unveiled it. MS went the route of separating the main boards to increase airflow over their surfaces. That makes their design more complicated, and hence prone to more manufacturing errors. Again, I point back to the Triad.

It’s irrelevant because I would just have redesigned the PS4 to accommodate the cooling.

You said increase the clocks on Jaguar. Bumping it up to 3.5GHz would be an overclock… unless I read you wrong. Either way, the Jaguar CPU was based on a mobile design, and thus its power envelope and base clocks were a conscious choice. I wish we had gotten a desktop Jaguar or A13 chip during the heavy machinery life cycle, but we did not. I don't think CPU clocks would have made much of a difference for the PS4 except for load times. The biggest issues were feeding the GPU, keeping it cool enough, and supplying fast enough memory access.

No, I was comparing the base clock of the Jaguar with the base clock of the Bulldozer. In fact, I was slightly underclocking the Bulldozer to 3.0GHz in my example. I was suggesting that the higher IPC of the Jaguar does not enable it to overcome the clock speed advantage of the Bulldozer.

It is not irrelevant; they had parameters they had to stick to for each console. The design as a whole is important. That is my point.

I will just call it here. The answers were given. What you do with them is on you, but this conversation will go nowhere.

Those parameters were chosen to accommodate the TDP of the device. A higher TDP would have changed those design parameters early on, just as it did with the current gen.

@cakeisamadeupdrug
Ok you aren’t listening.
It's not as simple as pumping up the speed of the CPU and the games will fly.
There would have been no effect on the games other than pushing up the power draw and heat output…

1 Like

I am listening, I just disagree with what you are saying. The last generation was hard capped at 30 fps because of a severe CPU bottleneck – so much so that they pushed 4K 30fps instead of 1080p 60fps because the GPUs were that much more capable than the CPUs.

We also saw severe limits on map design and on the scale and scope of AI, because the CPUs just couldn't handle anything more complex. At one point, Microsoft was looking at offloading CPU tasks to the cloud because the consoles' CPUs were so inadequate.

Where did you get the info that it's because of a CPU bottleneck?

The consoles didn't push 4K. That was Nvidia. The PS4 and XBone were talking 1080p from release, and even at release the XBone was dropping below 1080p.
Nvidia was talking 4K with every GPU release. 4K has been around since the first Titan was released, and even today we need upscaling to play at proper 4K. Nvidia was pushing 4K, not the consoles.

Are you sure you are talking about the PS4? Because that's when Assassin's Creed started talking about 200 NPCs on screen. The open worlds got larger and larger.
The gaming industry is crippling games, but not because of the consoles, and especially not because of the Jaguar CPU. That is a completely different conversation at this point, and believe me, the issues you are talking about have nothing to do with CPU clocks…