CPU Requirements for high FPS Gaming?

I know this might not be the typical question, but here goes anyway.
My current i5 is not cutting it anymore. I recently upgraded to a 1070 Ti, and my i5 is bottlenecking it.

Now the fun part. I'm 99% sure the CPU is the limiting factor when it comes to high framerates. Let's take Overwatch as an example.
With every setting maxed I get around 120 FPS (though with massive dips that make it unplayable. I haven't figured this out yet, but I'm guessing CPU too). Turning every setting to minimum gets me 140-150 FPS. So clearly, I'm at a point where my CPU bottlenecks my GPU.

When looking for a replacement, I'm not sure what to look for. Most benchmarks and tests run new AAA titles at max settings or even 4K. But that typically gets you around 60 frames, far outside the range where I've seen CPU bottlenecks.
So, let's say I want to hit a consistent 300 FPS (the max Overwatch supports): what do I look for in a CPU? I know I don't need 300 FPS locked. I just want to figure out which metrics are important when looking for a CPU specifically for high frame rates rather than high visual fidelity at 60 FPS.
Are higher clock speeds all that matter? Core count? IPC?

Since I'm looking at AMD at the moment, would a 2600X get better performance than a 2700 because of the higher boost clock? How does this change when I add a browser and, let's say, Discord into the mix? Would an i5 9600K perform better, again because of the higher boost clock, or is the similar base clock more important?

Are there any benchmarks that explore pushing the GPU with high FPS in competitive games rather than going with max settings everywhere?

Unfortunately there is more to high FPS than just how much hardware horsepower your system has. Some game engines just won't handle 200+ FPS well at all. You could do much more with software optimizations… eh, it's a Blizzard game, never mind…

Not knowing which i5 you have, it's hard to say if it's the one holding you back.

Task Manager/system monitor will tell you: if your CPU is over 95% all the time, your CPU is holding you back.

Ryzen 3000 comes out July 7th, so I would wait for it, if only for the IPC uplift and great clocks.
Some 8600 non-K CPUs from Intel might also be a good choice. I have no idea how the 9600 non-K does.

About GPUs: we absolutely need to see if the 5700 series is any good at all. Can't say yet.

I would not buy an Nvidia GPU myself, but for a Windows user I would recommend the GTX 1660.


Honestly, you don't need to feed your monitor more than its refresh rate, unless you are an actual competitive gamer who's on a team or trying to get on one. Don't waste your time trying to hit max FPS targets; it won't make you better at the game.

If your goal is maxing the FPS, you pretty much want an Nvidia GPU, because to hit those targets you want the highest-end GPU you can get. And unless your budget is super low (not really someone chasing 300+ FPS targets), you'll be in Nvidia's price range anyway, depending on how well the new XT cards and whatnot actually turn out.

Oh yeah, I get that. I'm 100% sure, though, that Overwatch is capable of running at a locked 300 FPS, as several streamers, most "pros" and two friends of mine run it like that (lowest settings, 75% render scale).

It's an i5 6500. And yes, it's pinned at 100% while in game. The Task Manager usage percentage for the GPU seems to be off, as it's always sub-20% for me no matter the game. HWMonitor reports 80-90% at high settings and sub-50% at lowest settings. So I'm pretty sure it's CPU related.

Definitely. I'm just trying to make up my mind whether I need cores, clocks or what. I'm getting a new CPU and would prefer a new Ryzen. Additionally, I'd like to spend only what's needed. So, if higher clock speeds are most important, the lower-end Ryzen 5s might be better for me than the higher Ryzen 7s while also saving me money.

No new GPU needed, I think. I'm happy if I can get 100% GPU utilization at lowest in-game settings. Whether that equates to 170, 250 or 100 FPS isn't too important. Plus, my 1070 Ti should be more powerful than a new 1660 anyway.

There is more to it than the refresh rate of my monitor. For one, with Overwatch at least, it's been shown that higher framerates decrease input lag considerably. Other than that, it isn't about the number at all. I just want my GPU to be utilized 100%. And in Overwatch, that means hitting high frame rates, as it makes a difference in how the game feels to play.
Also, having considerably higher FPS than needed will ensure you hit your desired framerate in the most intense situations. I have a G-Sync monitor, and at the moment my frames drop from 120 to 40. In those situations the game becomes a slideshow and is unplayable. Not because 40 FPS is so bad, but because of the massive drop.
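The latency side of this is simple enough to put numbers on (a plain-arithmetic Python sketch, nothing Overwatch-specific; the values are just illustrative):

```python
# Frame time in milliseconds for a given framerate: the minimum age
# of the newest frame the GPU can hand to the monitor.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# 100 FPS -> 10 ms per frame; 150 FPS -> ~6.67 ms per frame.
# Even on a 144 Hz monitor, the extra FPS means each displayed
# frame is a few milliseconds fresher.
saved = frame_time_ms(100) - frame_time_ms(150)
print(f"{saved:.2f} ms shaved off per frame")  # 3.33 ms
```

A drop from 120 FPS to 40 FPS, for comparison, jumps the frame time from about 8.3 ms to 25 ms, which is part of why a sudden dip reads as a slideshow even though a steady 40 FPS would be playable.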

Finally, targeting what I said will leave me with spare resources for browsing or other applications while gaming. I'm not a pro; I don't earn my money with gaming. But I am competitive and would like to have the best experience I can. No, it won't make me better, but I'll enjoy playing a whole lot more.

Intel 9700K / 9900K.

The fastest CPUs per core, and generally the best overclocking ability.
Although overclocking is always a matter of luck, of course.

That's the question.
The i7 9700K has 8 cores at 3.6 GHz base and 4.9 GHz boost.
A 2700X has 8 cores at 3.7 GHz base and 4.3 GHz boost.
Both are overclockable. The i7 is 100 bucks more. What makes the i7 faster? On paper they should be really close. Plus, I'm not too keen on paying for performance I wouldn't need in the end. I realize that this is a tough thing to say beforehand. I'm just trying to get an understanding of which factors are important for my goal.

Like I said, your monitor can only draw so fast, and you can only react so fast. That's like saying a 10,000 DPI mouse is the best because it's the most sensitive.

It's not very hard: bump the resolution above 1080p or raise the detail. Most people who are aiming for the highest refresh rate are running at low settings.

AMD CPUs tend to have better 1% lows. Check out Hardware Unboxed on this and look at their benchmarks.

(Maybe not at the top tier, but I would probably wait to see Ryzen 3000 vs. top-tier Intel.) (I was remembering their CPU battle from two years back.)
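For anyone unfamiliar with the metric: "1% lows" average the worst one percent of frames in a run, so a chip can post a great average FPS and still stutter. Reviewers usually compute this from frame times; a simplified Python sketch over FPS samples (made-up numbers) shows the idea:

```python
def one_percent_low(fps_samples: list) -> float:
    # Average the worst 1% of samples (at least one sample).
    worst = sorted(fps_samples)[:max(1, len(fps_samples) // 100)]
    return sum(worst) / len(worst)

# A run averaging ~149 FPS that hitches twice still scores poorly:
samples = [150] * 198 + [40, 45]
print(one_percent_low(samples))  # 42.5
```

This is why two CPUs with the same average FPS can feel very different in game: the one with the higher 1% lows stutters less.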


https://www.reddit.com/r/OverwatchUniversity/comments/4rx36a/a_guide_on_framerates_and_frame_latency_why/ explains it pretty well. I realize that this isn't a scientific study. If you want "proven" benefits, I'll have to pass; I go by feel on that. And I can definitely feel the difference between 100 and 150 FPS, even though my monitor can only show 144 Hz.

I already play at 1440p. And higher detail isn't the goal here. I specifically said I'm looking to increase GPU utilization at lower settings with higher frame rates.

Interesting. I’ll read up on this.

See, my update was about an older CPU battle, but I would wait for the Zen 2 chips to drop, as they are hitting pretty decent clocks.

Yeah, I'm on the edge of my seat for Ryzen 3000. Either because of the improvements, or because of price cuts for the Ryzen 2000 and potentially Intel 9000 series chips.
I'm just trying to work out which of the new chips is best for my use case. I'm not made of money, so getting the best bang for my buck is something I care about. If that means shelling out 400 bucks for a Ryzen 7 3000-series with more cores, I'm fine with that. I'd just want to avoid spending money on cores I don't need or clock speed that wouldn't benefit me in the end.

Wait, is this about CPUs or GPUs? The link seems to be about graphics cards, which I'm 100% not going to replace.

If you go Ryzen 3000, I would probably just go X470/B450.

I linked the wrong thing. This was the review I remembered.

About the better performance: I was looking at GPU reviews to see if there was a difference at that tier too.

That was the review. Not really a fair comparison, but a look at aging; buying at top-tier Intel prices isn't really relevant at the core counts we have now, imo.

Interesting. So running at close to max, the Intel CPU suffers in consistency.

In the end, I guess, it's me getting the highest-clocked Ryzen 3000. Since I rarely have only a game open, but do some multitasking, the additional cores should be nice. I don't see a use for a 12-core CPU, though.
While the 9700K and 9900K would probably be better overall, I don't think the new Ryzen CPUs would bottleneck my 1070 Ti. And paying 100 or 200 bucks more overall for not much real-world benefit isn't something I'm looking forward to.

I guess I'll have to wait for the first reviews and benchmarks to see how the new Ryzens stack up.

It really depends on the game. First-person shooters are typically not very CPU heavy. Games like StarCraft, where you have hundreds of units running about, are CPU heavy. Unless you are monitoring your hardware, it is difficult to say whether or not the CPU is being fully utilized. Try CPUID's HWMonitor. Not the Pro version, just the standard one. Run it while playing and you can see how much is being used.

Ok, I'm at home now, so I thought I'd give you all some numbers.
I'm not running intensive benchmarks or anything. Since we're talking Overwatch as the example so far, I'll stick to that. To keep it comparable, I'm standing in the training area at spawn without moving anything, same hero every time.
I do have a browser (Brave) and Discord open. The rest is my system as I always use it. So some Steam in the background, Logitech's software, etc.

All tests are at 1440p, running in exclusive fullscreen.

  1. All settings at High, Epic texture filtering and Ultra AA, 100% render scale
  • 133 FPS
  • 100% GPU usage
  • 80% - 100% CPU usage
  2. Same as above, but 200% render scale (so effectively running 2880p)
  • 41 FPS (much more stable, but feels really laggy)
  • 100% GPU usage
  • 35% - 40% CPU usage
  3. Lowest possible settings, 100% render scale
  • 170 - 250 FPS (really unstable, even while doing nothing)
  • 100% GPU usage
  • 100% CPU usage
  4. Same as above, 50% render scale (so 720p)
  • 170 - 250 FPS
  • 50% GPU usage
  • 100% CPU usage

Sadly, I have no way of conveying frame rate variance here. Also, those are best-case scenarios; you can subtract around 50 FPS when in a real game, and add a whole bunch of variance when teamfights start to happen.
A major issue seems to be falling in and out of G-Sync range. So when my frames dip below 75 (I think that was the lower G-Sync limit) or go above 144, stutters and frame drops happen. Being consistently above or below isn't that big of a problem. Being consistently in that range isn't either.

Number 1 is my current typical settings. Both low options yield higher FPS, but I can't maintain those, and dipping below 144 FPS triggers G-Sync and seems to introduce stutter for a short time. In a real game with settings from number 1, I see variance down to 40 FPS; 130 is typical when not much is happening.

To be honest, I'm not sure what to make of this. In all but the extreme cases, my hardware seems to be utilized 100%. Maybe the problem is all the other stuff a PC does, causing all this variance. From the data I wouldn't call this a typical CPU bottleneck. I can see the browser having a larger impact on the CPU, though, when it's pinned already. The GPU typically isn't used that much by other applications.

Oh yeah, all the usage metrics are from HWMonitor. If you feel like I could test other things, I'd be happy to do so.

  1. CPU is on the edge of max
  2. GPU bottleneck
  3. Max on both = peak efficiency (not sure how you go from 100/80-100 on high to 100/100 on low; that should be the easier test, no?)
  4. CPU bottleneck
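The diagnosis above boils down to a rule of thumb that can be written out explicitly (a sketch; the 95% threshold is my own pick, and note that an aggregate CPU figure can hide a single maxed-out thread on a lightly-threaded engine):

```python
def bottleneck(cpu_util: float, gpu_util: float, pinned: float = 95.0) -> str:
    # Whichever component is pinned while the other has headroom
    # is the limiter; both pinned means the load is balanced.
    cpu_max = cpu_util >= pinned
    gpu_max = gpu_util >= pinned
    if cpu_max and gpu_max:
        return "balanced"
    if cpu_max:
        return "cpu"
    if gpu_max:
        return "gpu"
    return "other (FPS cap, vsync, background load)"

print(bottleneck(100, 50))   # test 4's numbers -> "cpu"
print(bottleneck(38, 100))   # test 2's numbers -> "gpu"
```

By this reading, only the 720p run is an unambiguous CPU bottleneck; the 100% render scale runs sit right at the boundary.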

Looks nicely balanced at 100% render scale. The frame drops could come down to the game having difficulty running at such a high frame rate. If you could manually lock it at 150 or 200, it might be a smoother experience.
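Locking the framerate is normally an in-game or driver setting, but the mechanism is simple; here's a minimal sleep-based sketch in Python (real limiters are more precise, often busy-waiting the last millisecond or two):

```python
import time

def run_capped(render_frame, cap_fps: float = 150.0, frames: int = 300) -> None:
    # After each frame, sleep off the rest of the frame budget so
    # frames come out evenly spaced instead of as fast as possible.
    budget = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                      # stand-in for the game's work
        left = budget - (time.perf_counter() - start)
        if left > 0:
            time.sleep(left)

# e.g. run_capped(lambda: None, cap_fps=140) paces a no-op "frame"
# at roughly 140 FPS.
```

Capping just inside the G-Sync window (say 140 on a 144 Hz panel) also keeps you from bouncing out of the variable-refresh range at the top end.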


Yeah, I thought about limiting it to something within G-Sync range, so 140 or so. I'll test this in a live game when we play later.

My idea was that, since my CPU is at 100% in most "normal" cases, the frame rate variance is induced by all the other programs on my PC. So, a "higher end" CPU would yield similar FPS while being more stable, as it's got enough juice for the browser, etc.

Your CPU is the bottleneck here. As opposed to all other Blizzard games, Overwatch wants both cores and clocks (the hard FPS cap is at 300).


Try scenario 1 again, but at 90% render scale, and set all the FPS locks (144 cap, V-Sync, G-Sync). That should make it feel "smoother".

As for CPUs to upgrade to: see how Ryzen 3000 does, then you can always grab a 6-core (or more) Intel if they are still faster.

Input lag and so on is measurable at this point. Human nerves are just not fast enough for <50 ms to make ANY difference.


From your numbers it looks like a higher-end CPU will mostly benefit you at lower graphical settings.

Definitely not true. As a guitarist, I can 100% tell the difference between 10 ms and 20 ms of monitoring latency. Sub-10 ms is basically real time. Anything above is workable with some adjustment, but definitely noticeable. 50 ms throws off my rhythmic timing.

I agree on the rest, though, and appreciate the video. I'll watch through it now. The most interesting thing is how high the variance on all of those tests is.