Be careful with overclocking

I see lots of posts about overclocking, with some posters suggesting they overclock this or that CPU.

I think the one thing I would like to see is people offering both sides of the coin, maybe even a sticky at the top of this very forum, so people can make a more informed choice on whether to overclock or leave it alone.

In my humble opinion, overclocking a CPU is not worth the risk. That is just me; I don't see any advantage to it other than bragging rights. I think many don't realize that when you overclock a CPU you are going to shorten its life, and you could very well fry it if you don't pay attention to it.

I have yet to see any real-world performance gain that is worth the risk of overclocking. I would love to see Logan do a video and offer both sides of the overclocking coin: show us an overclocked vs. a stock CPU.

Now, I do consider that maybe I am wrong. If I am, then please explain to me how; I am fine with that. I am always OK with a nice friendly debate on any subject.

Overclocking is just tuning your CPU to get maximum performance out of your particular chip and set of components.

In some cases it's just not worth the money. For example, I personally would not overclock an FX-8350, because compared to running it at stock it requires a more expensive motherboard, a better power supply, and an aftermarket cooler.

So for AMD, instead of doing an overclocking build, I would just get some mid-range components, overclock them to their limit, and leave it at that.

And no, you generally can't fry your CPU by overclocking. A CPU going over its maximum operating temperature just causes the PC to shut down. You can, however, fry the VRMs on your motherboard if it's a cheap (usually MSI) AMD board.

Lifespan is not really an issue. Have you ever seen a broken CPU?

I have a feeling that Intel underclocks their CPUs:

1. Because the stock cooler isn't good enough.

2. To maximise longevity.

3. To maximise reliability.

I don't see why so many people do it: it doesn't give you enough extra performance in games to make the cons worthwhile, in my opinion.

you talk like it's some super risky thing to do... and it is, if you don't know what you're doing... if you aren't planning on reading up extensively on how to overclock, then don't do it...

a lot of people just dial up the multiplier till the system crashes and leave the voltage alone... then dial it back a bit once they hit the crash... and that's as safe as safe gets, you aren't shortening anything on your cpu's life, and you'll need an aftermarket cooler... your CPU will shut itself off if something goes wrong... now you can dial it in much more precisely in your UEFI, setting voltage min/max, counteracting vdroop, setting individual core multipliers, and milk it for all it's worth... but in most cases it's just not worth the hassle...
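the dial-up-then-back-off routine above can be sketched as a toy loop (purely illustrative: `is_stable` is a hypothetical stand-in for a real stress test such as an hour of Prime95, and all the numbers are made up)...

```python
# Toy sketch of "raise the multiplier until it crashes, then step back".
# is_stable is a placeholder for an actual stability/stress test.
def find_max_multiplier(is_stable, start=34, limit=50, backoff=1):
    mult = start
    while mult < limit and is_stable(mult + 1):
        mult += 1          # raise the multiplier one notch at a time
    return mult - backoff  # then dial it back a notch for safety margin

# Example: pretend this particular chip happens to be stable up to 45x.
best = find_max_multiplier(lambda m: m <= 45)
print(best)  # -> 44 (45 minus one notch of safety margin)
```

in real life each `is_stable` check is hours of stress testing, which is why most people stop at "good enough" rather than milking every last notch...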

the main thing to pay attention to is heat dissipation, and there are different thresholds for different processors... for instance, 72C is the threshold for my i5 2500k... if it runs hotter than that I could be losing life off my cpu... at stock it's listed as 3.3GHz... I have mine overclocked to 4.5GHz with a very mild adjustment to voltage... it runs about 62C under load, which is about what it used to with a stock cooler and no OC...
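as a toy illustration of the threshold idea (the 72C figure is this poster's number for a 2500k; the millidegree unit is how Linux sysfs thermal zones report temperature, and the helper itself is hypothetical)...

```python
# Hypothetical helper: decide whether a sysfs temperature reading
# (reported in millidegrees Celsius on Linux) exceeds a per-chip threshold.
def over_threshold(millideg, threshold_c=72):
    return millideg / 1000 > threshold_c

print(over_threshold(62000))  # -> False: 62C is under the 72C threshold
print(over_threshold(80500))  # -> True:  80.5C is over it
```

on a real Linux box the reading would come from something like /sys/class/thermal/thermal_zone0/temp, though the exact sensor path varies by machine...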

if you're concerned about it, then don't bother doing it... and don't bother paying extra for k model intel chips...

but telling people that getting more performance out of their PCs is a dumb decision because you don't know how to do it properly is naive... people pay a premium for hardware capable of achieving stable overclocks... what's a dumb decision is buying said hardware and not using it...

"in games" ... some people do other things with their computers than playing games...

I do sometimes run my FX-8350 overclocked when I have a huge load of work that needs to be done. For example: when encoding a large batch of files in Handbrake, my overclocked CPU is noticeably faster and makes a big difference in the time spent encoding. As a bonus, I can heat my house really nicely when running it overclocked at full load for an extended period of time, which is nice in winter :P.

As far as gaming goes, overclocking my CPU and GPU does improve my framerate quite nicely (I either get more FPS and/or a more stable framerate) but I want my system to be as silent as possible so I keep everything at stock speeds. Also, I don't play the most demanding games and it's just not necessary for everyday use.

One thing to note: the reason we can overclock CPUs at all is that, in most cases, they ship clocked at less than their full potential.


I say "in most cases" because it is a truth of mass manufacturing. In order to guarantee a minimum level of performance at an acceptable risk of falling below that target, you have to bring the majority of chips down to the performance level of something like the lowest 5% of chips deemed acceptably functional. That is where the comments about the silicon lottery and good/bad overclockers come from.


Somebody else could probably explain this more articulately; in any case, my point is that you are far more likely to overclock into an entirely safe level of performance for the CPU you get than to damage it by going too far.



The performance of this type of CPU is the distribution on the left in Fig. 1: on average "1010 performance units" (I just found the image on Google, so I'm making do). The level of performance varies about this number, dipping to around 950 performance units in the worst case. The distribution on the right in Fig. 1 shows the same thing standardised, with 1010 performance units now considered zero.




Then say 95% of CPUs is the fraction you want to send through, to balance the costs and risks of defective products in the market. The shaded region in Fig. 2 is the CPUs that don't meet the minimum performance you want (an 8350 in that region may become an 8320 in this instance, for example, but that's a different topic really). So you make all of the good 95% of CPUs perform at the performance level X in Fig. 2, which is the lower 5% point or thereabouts, and then market it as that, confident that the products will almost certainly perform as advertised.
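The "lower 5% point" idea above can be sketched numerically with the standard library's normal distribution (the mean and spread here are made-up illustration numbers, matching the "1010 performance units" in the figures):

```python
# Toy binning sketch: ship every chip at the speed the bottom ~5% can sustain.
# mu/sigma are invented "performance unit" numbers for illustration only.
from statistics import NormalDist

chip_performance = NormalDist(mu=1010, sigma=30)
guaranteed = chip_performance.inv_cdf(0.05)  # the 5th-percentile chip
print(round(guaranteed))  # -> 961: the level you could safely advertise
```

So an average chip from this imaginary batch has roughly 5% of headroom sitting unused at stock, which is exactly the headroom the silicon lottery hands to overclockers.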

Fig. 2



Figure 3 is just for reference and to add more understanding.

Fig. 3



Disclaimer (because hate is bad): I am not claiming to be any expert, just trying to add some basic background info that I am positive plenty already know. I am also sure I have left loads out and gotten bits and pieces wrong, haha. Somebody more qualified could jump in and help sort out any issues my post has, though.

Personally, as someone who uses CAD programs like SolidWorks, I'd say overclocking is well worth it when the different analyses run much faster than they do at stock. Productivity is really why you SHOULD overclock.

There is no danger in overclocking as long as you don't touch the voltages. I also have an original Core 2 that has spent its entire life overclocked and is still rock solid.

OCing is safe. Overvoltage is stupid.

Depends on the game. In some of the new high-end ones (Crysis 3, for example) it can help. Personally I do a lot of coding, and cutting down compile times is always nice. Video rendering and a good number of other things benefit from overclocking too, including higher-end GPU setups (for 1440p or 4K, usually): tri-SLI 780 Tis on an old Sandy Bridge chip would benefit from a bit of a CPU overclock.


Personally I've had a 2500k running at 4.5GHz for a couple of years now; it always stays under 70C and I've never had a problem with it. It stays under 1.34V, and as long as it lasts until the middle of 2016, when I get my master's degree and do a new build to celebrate, it will have done its job.

OCing is fine as long as you have a decent power supply. Excessive heat and really high voltage shorten life span but you're still going to upgrade long before that happens, and whoever buys/inherits your old chip will upgrade before that happens too. I know a few people still running Q6600 @ 3.4GHz, plenty of people who are running 2500Ks @ 4.8-5GHz and probably will be for quite some time.

As long as you have a good power supply and motherboard, there's nothing to worry about. Now, for gaming there's not much to gain as long as your CPU is good: most games run just fine on a stock CPU, and a 4670K at stock and at 4.5GHz will give you identical framerates (not always the case; it depends how CPU-hungry the game is). But in other stuff, like video editing, it makes a huge difference.

Has anyone considered the point of view that you would start overclocking your CPU later, when its performance is really starting to slow down (meaning it's getting a little bit old)? What I want from my PC is to be silent and have good performance. I have a Corsair H100 cooling an i7-3970X, and damn it is silent when I'm running my CPU at stock. I like it; the core temperatures (I am a real perfectionist about low temperatures) are really low, idling at 30C and around 45C under load. In the future I would overclock it when it seems to be the bottleneck of the system. Also, if you are on a budget and will buy an aftermarket cooler for the CPU and overclock it for a better price-to-performance ratio, I would accept that. Otherwise I can't see the benefit.


I think the same thing. I have overclocked my system for benchmarks; I have had my CPU all the way up to 1.6V, which is safe under water. But I still run it at stock because it is not a bottleneck. The only area where I would see better performance is CPU rendering, and honestly I don't want to see my system running at 100% 24/7 at 1.6V and 4.8GHz. As for GPUs, I replace mine every year or two, so life expectancy is not a big concern.

If I have a CPU bottleneck and my system is nearing the end of its run, then I will overclock it to extend its life.