CPU Speed wars

What happened to the CPU speed wars, like back in the '90s? It seems we've stalled at around 4 GHz.

Shouldn't we be at 10 GHz or 20 GHz by now? I know we have more cores on the CPU, but still.

It never stopped, it's just not as big a deal now that AMD hits 5 GHz at stock.

At least for desktop, mainstream use, it all comes down to power consumption. Increasing frequencies becomes impractical: it just eats too much power and generates too much heat. It's better to run at lower speeds and increase instructions per clock (IPC), which is essentially how much work the CPU does each clock cycle, to get performance increases.
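A rough way to see why: useful throughput is roughly IPC times clock. A minimal sketch in Python; every number is made up for illustration and neither "chip" is a real part:

```python
# Rough single-thread throughput model: instructions per second ~= IPC * clock.
# The IPC and clock figures below are illustrative, not measured values.

def throughput_gips(ipc, clock_ghz):
    """Approximate throughput in billions of instructions per second."""
    return ipc * clock_ghz

high_clock_low_ipc = throughput_gips(ipc=0.8, clock_ghz=3.8)   # NetBurst-style design
low_clock_high_ipc = throughput_gips(ipc=2.0, clock_ghz=2.2)   # wider, slower-clocked design

print(f"3.8 GHz, low IPC:  ~{high_clock_low_ipc:.1f} billion instructions/s")
print(f"2.2 GHz, high IPC: ~{low_clock_high_ipc:.1f} billion instructions/s")
```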

Intel notably had this problem with the Pentium 4 and its NetBurst architecture. It had relatively low IPC, but the idea was that insanely high clock speeds would make up for that; 10 GHz was projected. They never got close: it took too much power and created too much heat, and they could barely get it to 3.8 GHz. Even then it was often slower than AMD CPUs clocked far lower, sometimes as low as 1.2 GHz.

This is the "Megahertz Myth": the idea that high clock speeds mean a better, faster CPU, when that isn't always the case. Intel leaned on its high clock speeds in marketing despite its parts being slower. AMD countered by creating the "PR" or "Performance Rating" system and used that to advertise its CPUs. So a part clocked at, say, 2.2 GHz would be sold as a 3200+, the implication being that it performed like an Intel part clocked at 3.2 GHz.

More recently AMD tried the same thing with Bulldozer. The FX-9590 will hit 5.0 GHz, but just like NetBurst before it, it eats power and runs very, very hot, all while performing worse than much lower-clocked Intel parts. A 6700K at 4.0 GHz will in all likelihood be faster.

Intel has done well keeping IPC and clock speeds high while keeping power consumption low, which is why their parts are so strong. Even so, I doubt you'll ever see anything like 10 GHz with current manufacturing techniques; you'll probably never see much over 5.0 GHz at stock. The power consumption and heat output aren't worth it.

Weren't they messing with crystals or some such?

They used a quartz crystal oscillator in some parts to generate a clock, if I recall??
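They still do, more or less. The crystal and clock generator on the board produce a low reference clock (commonly around 100 MHz base clock on modern desktop platforms), and a PLL inside the CPU multiplies it up to the core clock. Quick sketch; the base clock and multiplier are typical examples, not the spec of any particular chip:

```python
# The crystal/clock generator supplies a low base clock (BCLK); the CPU's PLL
# multiplies it to reach the core clock. Values below are typical examples only.

BASE_CLOCK_MHZ = 100   # reference clock from the motherboard clock generator
MULTIPLIER = 40        # CPU core ratio

core_clock_ghz = BASE_CLOCK_MHZ * MULTIPLIER / 1000
print(f"{BASE_CLOCK_MHZ} MHz x {MULTIPLIER} = {core_clock_ghz:.1f} GHz core clock")
```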

It's a performance-per-watt race now. Get with the times, bruh.

Clock speed is only one part of the story. With the rise of mobile computing (in all its forms), performance per watt is the primary goal, along with IPC. Clock speed doesn't mean anything if a 7 GHz CPU is doing less work than a 4 GHz CPU with a much higher IPC.

Basically people realized it was silly. ;)
As said before: at a certain point it gets exponentially harder to raise the clock speed even the tiniest bit (rough numbers sketched below).
For the consumer, around 4 GHz is the practical maximum; us crazy people might push it a bit further because stupid software like games and Adobe stuff still isn't optimized for a bunch of cores.
For the commercial/server side, optimal performance per watt is key, and right now that sits somewhere between 1.5 and 3 GHz depending on the chip.
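The "exponentially harder" part is mostly because dynamic power scales roughly as C·V²·f, and you generally have to raise the voltage to keep higher clocks stable, so power climbs much faster than performance. Back-of-the-envelope sketch; the capacitance, voltages and frequencies are all made up for illustration:

```python
# Dynamic CPU power scales roughly as P = C * V^2 * f. Higher clocks usually need
# higher voltage, so power grows much faster than the clock does.
# All values below are invented purely for illustration.

C = 2.0e-8  # effective switched capacitance in farads (made-up figure)

def dynamic_power_watts(voltage, freq_ghz):
    return C * voltage ** 2 * (freq_ghz * 1e9)

# Assumed (illustrative) voltage needed at each frequency to stay stable.
operating_points = [(3.0, 1.00), (4.0, 1.20), (5.0, 1.45)]  # (GHz, volts)

for freq, volts in operating_points:
    power = dynamic_power_watts(volts, freq)
    perf_per_watt = freq / power  # crude proxy: clock alone, IPC held constant
    print(f"{freq:.1f} GHz @ {volts:.2f} V -> ~{power:.0f} W, "
          f"~{perf_per_watt:.3f} GHz per watt")
```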

It's down to physics, really.

One of the issues Intel ran into while trying to push their CPUs to 10 GHz is that they needed smaller nodes to do so. For a few die shrinks it worked great, hitting 3-4 GHz, but then voltage leakage started to creep in. At first it was a slight annoyance, then as they shrank further it became a minor problem, then a MAJOR problem, causing heat to go up exponentially, and higher clocks needed ever more power.

At that point a new node might get you roughly 40% more performance out of a single-core chip, versus roughly 80% by using the extra die space for more cores.
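Whether the extra cores actually beat the clock bump depends on how parallel the software is (Amdahl's law), which is also why games and the like lag behind. A quick sketch; the 40% clock gain, the core count and the parallel fractions are all hypothetical:

```python
# Compare two hypothetical ways to spend a die shrink: clock the single core ~40%
# higher, or keep clocks and double the core count. Multi-core gains follow
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), with p the parallel fraction.

def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

CLOCK_GAIN = 1.40  # hypothetical single-core speedup from higher clocks

for p in (0.3, 0.7, 0.95):
    core_gain = amdahl_speedup(p, 2)
    winner = "more cores" if core_gain > CLOCK_GAIN else "higher clock"
    print(f"parallel fraction {p:.2f}: +40% clock -> {CLOCK_GAIN:.2f}x, "
          f"2 cores -> {core_gain:.2f}x  ({winner} wins)")
```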

At that point mobile kinda took over and it became all about performance per watt, and since the desktop market is so small (really the only people who care about GHz), the focus shifted to power efficiency for phones and data centres.