Well, I was advising one of my cousins about buying vs. building a PC. Since they aren't gonna do hardcore gaming, I suggested a desktop with at least an i5, or if they wanted something decent, one with at least an i5 3570k. My cousin's husband also asked about overclocking. After a few minutes of surfing I realized that between an OC'ed 3570k and a stock one there's barely any difference anymore; almost all benchmarks show less than a 20% gain in games.
Anyway, it just really made me realize that from the 3570k to the 6700k, not much has changed in computers over the last 5 years. A big technological plateau, sorta like what happened back during the 1GHz wars or the 64-bit era.
What do you guys think?
Games have gotten much more GPU-dependent in recent years. OCing the CPU will help you with productivity (e.g. video editing), but OCing the GPU will net you tangible performance benefits in games. In short: OCing the CPU doesn't matter much for games, but OCing the GPU is still very relevant.
It depends on the game. If a game is heavily CPU-bound, like, let's say, Guild Wars 2, overclocking a 3570k or a 6600k can get you a significant gain in performance; some have reported as much as 20%, or a 15 to 20 fps increase. The 6700k doesn't matter as much right now because it's an i7 with 4 cores and 8 threads, and most games can still only utilize 4 cores, though that is quickly changing. If you are building on a budget, it's better to go with a non-overclockable CPU so that you can buy a stronger GPU, because most games lean on the GPU more.
But then there are people who do other things, like rendering, where it actually does matter.
Many of the games out there are GPU-dependent and only require a moderate CPU like the i5 3570k or something lesser. Overclocking, whether of the CPU or GPU, will show performance gains, but how much depends on the game and hardware.
As for me, I think overclocking isn't worth the hassle for the 'free' performance gain. Just go overkill on the CPU, basically getting an i7 or Xeon equivalent if possible, since the extra threads are nice to have for running programs in the background while gaming, or for doing something else such as video rendering.
This is just my 2c
The i7 6700 turbos to 4GHz; overclocking has been rendered all but pointless for gaming tasks.
Overclocking is overrated. It's not like a PC can't do something, then you overclock it, and then it can. Sure, a synthetic benchmark will have a higher score, or you'll see your fps counter go up, or maybe a video renders a little faster, but it still got the job done at stock clocks.
Oh god yes, from an enthusiast standpoint, we have been in a period of stagnation. There are a number of reasons for this.
The first is that the mobile market boomed, so there was a shift in focus from raw performance to energy efficiency. Both AMD and Intel have been looking toward integration of various components to help with this, but Intel has been emphasizing die shrinks in tandem with architecture changes, while AMD has been concentrating mostly on tweaks and changes to their architecture. This is a very basic view of the situation that isn't wholly accurate - it's kind of like trying to say engines work by explosions. It's true, but saying an engine works by explosions isn't the same as explaining the internal combustion engine.
Anyway, game developers have also recently smartened up about how to properly code a game. (And you can thank AMD with its Mantle initiative.) It wasn't too long ago that Intel released the unlocked G3258 Pentium and it was all the rage for price-performance, and now, just as quickly as it became popular, it has almost faded from memory. In fact, with DirectX 12 on the horizon, and more games supporting multi-threading, having only two cores (and/or two threads) is barely sufficient and not recommended. Single-threaded performance is still quite important, but we're now seeing a shift away from that.
It depends on the chip/game/GPU at this point. I get a big boost in gaming from overclocking my 8350, but I'm also pairing that with a 295x2. The 8350 is older and has lower IPC, AMD cards perform better with better CPUs (to be clear, the gains made from switching from, say, an i3 to an i7 are greater on AMD cards than Nvidia cards), and I'm running crossfire, which benefits from faster CPUs.
Also, certain games are starting to show a preference for higher-bandwidth memory in CPU-limited cases; see Digital Foundry on YouTube for more info.
this one points out the gains from overclocking an older chip, specifically when paired with faster RAM
this one compares the 6700k and the 5960x
finally, this one compares a variety of chips
It's interesting to see these results. Specifically, I'm very curious how the 5960x loses out to the 6700k despite being far more powerful on paper; the difference seems to come down to clock speed.
While most gamers are GPU-limited at this point, that's not to say that overclocking a CPU has no purpose or little benefit. And that's before taking productivity like video rendering into account. I think there is still a healthy community for overclocking. Take for example the Pentium G3258: for 50 dollars you get a nice dual-core processor. Slap a Hyper 212 on it and for less than 100 dollars you can overclock the piss out of it, usually getting up around 4.4-4.6GHz, and have an extremely competitive solution that without overclocking would be far less effective.
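As a back-of-envelope check on that G3258 claim (assuming the 3.2GHz stock clock; the 4.5GHz target is just the middle of the range people report):

```python
# Rough relative clock gain from overclocking a G3258.
# 3.2 GHz stock and 4.5 GHz OC are assumptions, not measurements.
stock_ghz = 3.2
oc_ghz = 4.5

gain_pct = (oc_ghz - stock_ghz) / stock_ghz * 100
print(f"Clock gain: {gain_pct:.1f}%")  # ~40.6%
```

A 40% clock bump on a sub-$100 CPU+cooler combo is why that chip was all the rage, even if real-world fps doesn't scale 1:1 with clocks.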
It matters more for older and slower chips than it does for newer ones. Which still isn't a lot.
My 4670k can hold a 4-core turbo of 4GHz all day long, but the normal turbo is like 3.4GHz. It's not a huge boost, but it helps. For older chips, like my Core 2 Duo at 1.6GHz that OC'ed to 3GHz, it made more sense.
Or stuff like the 2500k, OC that to 4ghz and it is still awesome today
Most of those games could pull 60+ fps. Once you're over 60 fps, a 10-30% fps increase is nothing; until you can get a 50-100% increase, I mean hitting at least 100-120fps, it's not a worthy OC. Literally no one will notice 75fps over 60fps unless they are on a CRT. In the summer I'm not gonna OC a relative's PC and cook their room for 15 fps.
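To put numbers on why 60 to 75 fps feels like nothing, here's the frame-time view (my own illustration, not from any benchmark):

```python
# Convert fps to per-frame render time to show how small the
# 60 -> 75 fps jump is in absolute terms.
def frame_time_ms(fps):
    return 1000.0 / fps

saving = frame_time_ms(60) - frame_time_ms(75)
print(f"{frame_time_ms(60):.2f} ms -> {frame_time_ms(75):.2f} ms "
      f"(saves {saving:.2f} ms per frame)")
# 16.67 ms -> 13.33 ms (saves 3.33 ms per frame)
```

Each frame arrives only about 3ms sooner, which is why a 25% fps gain above 60 is far less perceptible than the same percentage gain down at 30 fps.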
Thanks, I really enjoyed those benchmarks, but yeah, it really sucks how the gaming industry still ships unoptimized code ATM. That last vid really solidifies that the 2500k is still relevant and current when compared with the latest AMD & Intel chips.
When you have a stack of renders, even the smallest boost in performance can make a big difference. Every MHz counts.
Another factor to consider is the price difference between an overclocking motherboard and a budget one. Apply that difference to a better dGPU and it bolsters the argument.
OC'ing is fun and will continue regardless, though.
Given the higher-end nature of that hardware, I would expect so. Overclocking something like a G3258 paired with a 270x, the gains could be the difference between 50 and 60 fps. Or, more importantly, the overclock could prevent frames from dropping below thirty, making games playable, or playable at higher settings. Oftentimes I think the benefit of overclocking lower or mid-range hardware isn't the gain at the top end but the boost to minimum framerate.
If you don't need the frames though, there is no reason to OC.
On The Witcher 3 I noticed a huge performance impact when my CPU is OC'ed, but for most other stuff it doesn't matter.
Depends on what you do with it.
And in terms of gaming, which games you play and how you play them.
Upgrading from a 3570K to a 6700K can be a huge improvement in certain areas.
When games already run at 100+ fps, upgrading the mobo + CPU for 10-20 fps is not worth it, considering that 20fps upgrade costs around $500.
Yeah... it seems like OCing doesn't help as much in terms of gaming performance now. Having an i7 might be a good future-proofing option though.
Yeah, but there is more in life than just gaming.
Video editing and rendering for example, virtualization or other productivity stuff.
Haswell and Skylake CPUs have some additional instruction sets and features which are really useful in certain areas.