I feel like most people overspend when it comes to their CPU. I know the allure of trying to get the latest and greatest and fastest part, but for gaming, I think many people overvalue the need for a fancy/expensive CPU.
In all honesty, for purely gaming on a budget, you could probably get away with a quad-core AMD Athlon X4 or an Intel Pentium G3258.
I have an AMD FX-8350, which I got for its superior multi-threaded performance compared to the i5 3570K (which cost $20 more at the time). The 8350's single-threaded performance is only marginally worse than the 3570K's, while its multi-threaded performance approaches what was then i7 territory. I primarily game, but also do some video editing. Despite the 8350's single-threaded disadvantage, it still games very well with a GTX 770. I can play most games at or near max settings at 1080p.
I probably could have gotten away with a hex-core FX chip, or perhaps even lower. The good news is that I won't have to switch my CPU anytime soon; the 8350 will be more than sufficient for the next four or five years, and I can just upgrade the graphics card as need be (though the GTX 770 is still more than sufficient for my uses at the moment).
People build these monster rigs with i7 4770Ks, and I am just left wondering why they felt compelled to spend that much on a CPU, when an i5, or even an older AMD chip with a somewhat outdated architecture, would still play games at around the same frame rate and detail settings with an identical graphics card.
An i5 4460 and a cheap H97 mobo is the best bang-for-buck setup for gaming on a budget. AMD is sorely dragging behind with its CPUs (still usable), and in this day and age, when games are starting to use four cores, a dual-core Pentium just isn't cutting it.
I am aware that AMD is dragging behind; they haven't really come out with anything architecturally new since 2013 (or arguably 2012), but the 8350 was at least somewhat competitive when I bought it two years ago.
This is one thing I've been wondering about. I was looking at the full Skylake lineup and thinking it would be nice if somebody benched them all. For example, take them all at three different GPU tiers (980 Ti, 970, 950, or Fury X, 390, 370) and see how they game. It would be nice to see which games need four cores, where a dual core with Hyper-Threading works well, and where a plain dual core is all you need.
For example, look at the i3s: there is a $40 price difference for 0.2 GHz and 1 MB of cache (6100 vs. 6320). It would be nice to know how different they really are and whether the 6320 is worth it over the 6100. Also, the i5 6400 has four cores but is clocked 1.2 GHz lower than the i3, at a $30 premium. It would be nice to know which of these price points are better than the others (some rough napkin math below).
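Until somebody benches them all, the best I can do is napkin math. Here's a quick Python sketch using roughly the prices and clocks quoted above (approximate figures, and raw cores × GHz is a crude proxy that ignores IPC, cache size and Hyper-Threading, so treat it as a sanity check, not a benchmark):

```python
# Rough napkin math only: prices/clocks are approximate launch figures,
# and cores * GHz ignores IPC, cache, and Hyper-Threading entirely.
chips = {
    # name: (cores, base_clock_ghz, approx_price_usd)
    "i3-6100": (2, 3.7, 120),
    "i3-6320": (2, 3.9, 160),
    "i5-6400": (4, 2.7, 190),
}

for name, (cores, ghz, price) in chips.items():
    raw = cores * ghz  # crude "total clock" proxy
    print(f"{name}: {raw:.1f} core-GHz for ${price} "
          f"-> {raw / price * 100:.2f} core-GHz per $100")
```

By that crude measure the 6320 looks like the worst value of the three, but real game benchmarks could easily tell a different story, which is exactly why I'd like to see them.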
Also, related to the original question: quad cores do get more longevity than dual cores, with more games using more cores and some games needing four threads just to run. Here is a video I found comparing an i5 2500K to an i3 6100. So getting the better part pays off further down the road.
Yes, but it still demonstrates that the 2500K holds its own against a new i3, even with the i3's newer RAM. So depending on how long you go between builds and your budget, it may be worth it to get the higher-end part.
A lot of people want to say, "Get the newest! You'll never need to think about it again!"
For me, I am fine with older hardware. I'm running a final-gen Phenom X4 955 at the moment, in fact. I can stream, I use KDenLive for video editing so most of the work is piped to my GPU, and in general I have a good experience. No bottlenecks on my 250X or anything.
In the future I could do with an 8350 for a bit faster workload and to improve streaming performance (and I do have it planned that way), but overall a lot of people blow the need for the best way out of proportion.
Depends on the games you play... If you like Bethesda games or Star Citizen, the best CPU you can get will benefit you. That said, most people do go overkill.
If you're playing CS:GO and TF2, or hell, even BF4, you ought to be getting something mid-tier, not the best.
Personally I'm running a 4690K @ 4.75 GHz and I find it limiting.
I used to have an 8350 and upgraded to a 5820K. Why? Because Assassin's Creed 4 was running at 15 fps on the 8350 at 1080p. Was a 5820K a bit overkill for just that? Hell if I care; I could afford it and it fixed my issue. I also wanted the X99 platform so I could cram in loads of DDR4 goodness in the future when I can afford it, but that's a little off topic. Basically, what I'm saying is it depends on what games you play, what you can afford, and what you are willing to settle for.
Definitely. Especially with more games taking advantage of more cores, and genres like RTS that need more CPU power from the get-go. Yes, your games now may not need it, but who knows what you're going to be playing in three years.
I got a 4790K because I never wanted my CPU to be the issue for years to come. If I get a flagship-class GPU (which I plan to do), I don't want the CPU being the weak link.
As far as Assassin's Creed goes, the last AC game I got for PC was Black Flag, and it didn't run particularly well. Assassin's Creed games are typically not optimized for PC, so I pretty much stopped buying them. The majority of games don't need a 5820K to run, but I understand where you are coming from. If I had the cash, I know I would have something better than an 8350.
RAM speeds don't matter much in gaming with a dedicated GPU. But a quad core is definitely a must-have for some games. A true quad core is all you really need for gaming at the moment.
In my own personal experience I've seen an absolute maximum of about 50% load on my E3 1230 V2 strictly playing games. My GTX 770 has a much harder time during gaming than my CPU does. I'm thinking that something along the lines of an i5 2500K would probably still be fairly adequate for most* modern games, unless the user is the type who needs those last 3 FPS for whatever reason. For more recent CPUs, the i5 4460 is pretty hard to beat in price-to-performance, as @cooperman said.
*Really depends on the specific game, the engine it uses, drivers, moon phases, etc.
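If anyone wants to sanity-check where their own bottleneck is, here's a minimal Python sketch (assuming the third-party psutil package; GPU load you'd watch separately with your vendor's tool, e.g. MSI Afterburner or nvidia-smi). The point is that a 50% average can hide one core sitting at 100%, which is what actually limits frame rate in lightly threaded games:

```python
# Minimal sketch: log overall and per-core CPU load while a game is running,
# so you can see whether a single core is pegged even if the average looks low.
# Assumes psutil is installed (pip install psutil).
import time
import psutil

DURATION_S = 60      # how long to sample, in seconds
INTERVAL_S = 1.0     # seconds between samples

end = time.time() + DURATION_S
while time.time() < end:
    per_core = psutil.cpu_percent(interval=INTERVAL_S, percpu=True)
    avg = sum(per_core) / len(per_core)
    print(f"avg {avg:5.1f}% | per-core "
          + " ".join(f"{c:5.1f}" for c in per_core))
```

Run it in the background during a play session; if one core stays pinned near 100% while the average sits around 50%, it's the CPU's single-threaded speed holding you back, not the core count.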
Now if we asked this exact same question, say, five or six years ago, the adequate setup probably would have been a dual core with something like 4 GB of RAM and a 512 MB GPU, or 1 GB if you wanted the high-end stuff.
Personally I will always get at least an i7 quad core as long as I'm working with DAWs using CPU-hungry plugins. If I were just building a gaming PC, I'd just get an i5.
And as much as I'd love a 980 Ti, I can't see needing to replace my 660 Ti anytime soon.
With the inflation of core counts and GPU RAM, we are seeing a crazy race for higher numbers.
However, hardware is still making a slow transition away from only needing two CPU cores and 2 GB of GPU RAM; it is when you go beyond 1080p that you start to require more. Ironically, though, at higher resolutions the CPU becomes less of a performance factor than the GPU.
As things stand with the relationship between hardware and gaming performance, most people would be perfectly fine with the performance of an i3, if their only objective is gaming.
Yep. And probably for the next two or three years. I have an i5-4590 that I got for 150 EUR and I couldn't be happier with it. The ability to OC would be nice, but I don't actually need it; it's just nice, that's all. Every game I play runs great, and if it doesn't, it's because I could use a better GPU, not because my CPU isn't good enough.
Currently in my laptop I have an i7-4710HQ CPU, and for me that still isn't fast enough for games like Minecraft and Cities: Skylines. I agree that you don't need more than four cores for gaming at the moment, but single-threaded performance is still a big issue.