30 FPS + Motion Blur vs 60 FPS?

I have been playing with ultra details at 1440p on my rig, but due to my card not every game can hit a consistent 60 FPS. Since consistency in FPS matters more than the raw number, I have tried locking at 30 with motion blur, and for my eyes it helps me not want to throw up at the pleb-tier gaming experience. At 30 I can definitely feel each frame like a slide show, and the motion blur definitely helps hide that; and since 30 FPS is easier to run at higher graphics settings, I can maintain a consistent FPS.

I am not playing online first-person shooters, so the inaccuracy doesn't matter.

My question is, how many here have tried the same thing? Please remember I am not running an adaptive-sync monitor, so anything other than a frame rate that divides evenly into 60 will cause tearing or uneven pacing (i.e., half at 30, or the full 60).
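To make the divisibility point concrete, here is a rough toy model I threw together (my own sketch, not taken from any engine or driver): with v-sync on a fixed 60 Hz panel, a finished frame goes up on the next refresh tick and stays there until the following frame's tick, so only frame rates that divide 60 evenly get a steady hold time per frame.

```python
import math

# Toy model of v-synced presentation on a fixed 60 Hz panel (my own sketch):
# frame i is ready at i / fps seconds and is displayed from the first refresh
# tick at or after that moment until the next frame's tick. Counting ticks per
# frame shows why 30 FPS paces evenly (exactly 2 ticks, ~33 ms, per frame)
# while something like 45 FPS judders (hold times alternating 1 and 2 ticks).
REFRESH_HZ = 60

def ticks_per_frame(fps, frames=9):
    # Refresh tick on which each frame first appears.
    ticks = [math.ceil(i * REFRESH_HZ / fps) for i in range(1, frames + 1)]
    # How many refresh cycles each frame stays on screen.
    return [b - a for a, b in zip([0] + ticks, ticks)]

print(30, ticks_per_frame(30))  # [2, 2, 2, 2, 2, 2, 2, 2, 2] -> even pacing
print(45, ticks_per_frame(45))  # [2, 1, 1, 2, 1, 1, 2, 1, 1] -> visible judder
```

Running it prints an even [2, 2, 2, …] pattern for 30 FPS and an alternating [2, 1, 1, …] pattern for 45 FPS, which is the uneven pacing you feel without adaptive sync.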

Also, in non-competitive games, would you push the graphics and cap at half your refresh rate as well if your card was bouncing between 40 and 59 FPS, or would you lower to medium details to hit 60?

If I had a 4K panel I could run at 1080p (a clean half in each dimension) and keep high and ultra settings, but since half of 1440p is 720p, I feel I would be sacrificing a bit too much in the resolution department.

Anyway, what are your thoughts, comments, questions, and concerns?

IMO motion blur is cancer; I would dial back the settings for 60 FPS any day. Start with anti-aliasing, as it makes little difference at higher resolutions.

Yes, I would.

I usually disable motion blur, run games at the display's native resolution with high FPS, and drop settings to medium. This applies to pretty much all PC games, competitive or not.

When I have to lower the settings too much, I know it's time to consider upgrading the GPU.

On a side note, I moved your thread to the Gaming category. :slight_smile:

Yeah, I honestly turn most of those post effects off (most of them are hell on my bad eyes) and prefer turning down the settings to get a smooth framerate. They might be nice to turn on for screenshots, but during gameplay, depth of field changing dynamically when a lot of games can't even get the camera working right in close quarters just seems silly.

I imagine that if you can't get a smooth 60, then locking it to 30 would be the next best thing, as having it jump around is much less pleasant than having consistent frame timings. It won't help with the input lag, though.

I hate motion blur, and it isn't a replacement for frame rate. It's an effect to make fast-moving (in-game) things look like they're fast.

Given the choice between 30 FPS plus blur and even 45 FPS without it, I'd try for the frame rate, even if you have to drop other details to do so.

I disagree that a locked 30 is superior to, say, 40-45 or even 45-60, whatever the machine is capable of. 30 just looks and plays… bad. But that's my 2c.

If you're talking sub-60 FPS, definitely try for frame rate unless it really is a game that doesn't need accuracy or whatever. But for me, even lower-detail graphics look "better" if they're a lot smoother than 30 FPS.

In most games these days you can get massive FPS gains for not a lot of noticeable detail loss. Go look up Doom (2016) image quality vs. settings comparisons to see just how much you can get back in a typical modern game…

edit:
Then again, I say this as a gamer from the 80s-90s who grew up with sub-teen FPS in wireframe or flat-shaded polygon land in early 3D games, so I can deal with low res, low poly count, low detail, etc. But frame rate just helps gameplay and the sense of immersion (for me) so much.

I guess for me, no game graphics look "real", so they may as well "move realistically" instead (even at the cost of significant detail loss) for the immersion factor, if that makes sense?

Same reason I have little interest in 4K gaming just yet. 1080p is "good enough" unless you're on a massive screen at a short viewing distance. Having the frame rate 2-4x higher is IMHO better than having 4x the pixels… even if you're talking, say, 4K at 60+ vs. 1080p at 120-144 FPS.

A few points…

Some games are fine with 30 FPS while others need 60+.
You can make a custom resolution with a 50 Hz refresh on Nvidia (no idea about AMD) if you can't quite hold 60. A locked 50 Hz is always better than an almost-locked 60 Hz.
I'm old skool, so I tend to just set up all games with the lowest visual fidelity that in no way disadvantages the experience.

Depending on how good the motion blur is and what type of game you are playing, it may be “good enough”.

For myself, I prefer high visual fidelity to frames. In story and campaign games I can live with 44 to 75 FPS (FreeSync, F yeah!). I need high shadow quality, and I can't have grass drawing in where I notice it. It needs to be perfect.

Then there are online games. In online games I would drop to medium before locking to 30 FPS.


One exception: racing games.
I play racing games with settings cranked, FPS hopefully in the high 60s or at 75, and medium motion blur for the sense of speed.

Most modern games still look good at mid settings nowadays. I would.