No it isn’t. Actually, it’s 60 FPS at most. The “focused” viewing angle is 60 degrees, as shown by this chart. Yes, we can “see” 180 degrees, but not in focus.
Human Eye FPS: How Much Can We See and Process Visually? (healthline.com)
you can still see even if not in focus
The point is it’s in your peripheral vision, so do this experiment. Open a Newegg page or something with “box” ads. Look at the screen and try to read anything outside the two ads directly in front of your eyes, center screen. You will be able to “see” the other ads but not read a word.
Another point is that my statement is not patently false. It’s true, a fact: while you can visually see there’s more on the screen at, say, 4K, you can only focus on a 60 degree area. You can’t alter the size of your eye’s lens. Unless you’re Steve Austin, aka the bionic man. Just like we can see 160 FPS but not actually process the details of every frame.
I didn’t post this to create some argument. It’s scientific facts. These graphics companies have us all fooled into more resolution, more FPS because they know no one to few will ever read any facts regarding the human eye. They just want to sell us the latest and greatest most expensive hardware. All about the profit these days.
The brain is easily fooled by our eyes. No one has to agree; you can say I’m dead wrong, even with the knowledge I have going back 10 years from my CCTV experience. People dispute things that are fact all the time, like saying the Earth is flat when it’s round.
Take blurred images on a monitor: adding Freesync stops that by buffering the frames so it’s smoother to the eye. The refresh rate can handle the higher FPS; our eye can’t. Screen tearing is different: that’s the refresh rate not keeping up with the FPS.
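Toy numbers for the tearing point, for anyone who wants them (purely illustrative; real scanout timing is messier than this):

```python
# With vsync off, every GPU buffer flip that lands inside a panel
# scanout shows up as a tear line. Toy numbers only.
refresh_hz, fps = 60, 144
refresh_ms, frame_ms = 1000 / refresh_hz, 1000 / fps

flips_per_refresh = refresh_ms / frame_ms
print(f"{fps} FPS on a {refresh_hz} Hz panel: "
      f"~{flips_per_refresh:.1f} buffer flips per refresh, "
      f"i.e. 1-2 visible tear lines every scanout")

# A variable-refresh display (FreeSync/G-Sync) instead starts each
# scanout when a frame is ready, so flips stop landing mid-scanout.
```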
When I play BF4 at 1080p on my 32" screen, sure, I can “see” the dude on my left about 30m away ready to shoot, or I assume that’s the intention, and respond by pre-firing if my minimap shows no friendlies. The further out in front of you something is, the more clearly you can see it, as it falls into that 60 degree cone.
Some other things come into play, like how far away the screen is, which determines the best FOV, plus the overall size of the screen and more. There’s a whole article on that subject as well. I searched it to find the best FOV for BF4, since the PC version’s default is set for an Xbox viewed on a TV from 8 feet away. My screen is about 3 feet away.
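If you want to sanity-check the geometry yourself, here’s a rough sketch (it assumes a 16:9 panel; the two setups in the comments are just the ones described above):

```python
# Back-of-the-envelope FOV math for a flat screen. Real game FOV
# settings also depend on aspect ratio and how the engine defines
# its FOV angle, so treat this as a rough check only.
import math

def screen_width_inches(diagonal_in, aspect_w=16, aspect_h=9):
    """Horizontal width of a screen from its diagonal and aspect ratio."""
    return diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)

def horizontal_fov_degrees(diagonal_in, distance_in):
    """Angle the screen subtends at the viewer's eye."""
    half_width = screen_width_inches(diagonal_in) / 2
    return math.degrees(2 * math.atan(half_width / distance_in))

# 32" 16:9 screen viewed from about 3 feet, as described above:
print(round(horizontal_fov_degrees(32, 36), 1))   # ~42.4 degrees

# Same screen from 8 feet, the console-on-a-TV case:
print(round(horizontal_fov_degrees(32, 96), 1))   # ~16.5 degrees
```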
What’s the Difference Between Ray Tracing, Rasterization? | NVIDIA Blog
That’s the article about raytracing Nvidia published in 2017.
A final note: how much realism do we need in a game? There are enough players out there who “live” their game of choice like it’s reality as it is. With any of the games I play, I don’t have time to see the treads in the skid mark, the scratches in the paint from stones, or the whiskers on the dude I’m about to shoot. Some effects are great, but there has to be a limit before gamers lose touch with the real world altogether and spend life behind the screen or in VR. Remember, it’s mostly kids playing adult games.
You realize you don’t need to focus on something to see it; you then adjust focus to engage it next. Might as well only have that tiny field of view in a car then too, since the rest is pointless.
Says who? No one I know is like “OMG I need 4K or 8K games now!!!” Most people still play at 1080p, with 1440p just taking off, and honestly, unless you go bigger than 32" you don’t need 4K for gaming.
You 100% can tell the difference in how smooth something looks going from 60 to 120+ FPS.
Have fun with your opinions, but these are things that 100% make a difference. That’s not to say there aren’t diminishing returns, or very little return from person to person. But your statements do not prove true, from my experience, but you know.
Yeah, there’s no question about it. Even 60 to 75 is a huge improvement.
Though I think it’s more about frame consistency improving.
My secondary monitor runs 60, and when I have a migraine setting in, I get the “peripheral vision flicker” on that before the sides of my main monitor (75, Ultrawide) start flickering.
If 120Hz monitors were cheaper, I would throw out my two screens overnight.
I really did not notice the jump from 60 to 75Hz…
What I did notice is when my framerate dropped outside the Freesync range.
People, Variable Refresh Rate is a more important technology than 240Hz or whatever…
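A sketch of what “dropping out of the Freesync range” means in practice. The 48-144Hz window is just an example of a typical 144Hz Freesync panel, and the LFC behavior is a loose description, not any vendor’s exact logic:

```python
# How a VRR display responds to different framerates. The window and
# the LFC (Low Framerate Compensation) behavior here are illustrative
# assumptions, not any specific monitor's spec.
VRR_MIN, VRR_MAX = 48, 144

def describe(fps: int) -> str:
    if fps > VRR_MAX:
        return f"{fps} FPS: above range -> tearing, or a vsync cap at {VRR_MAX}"
    if fps >= VRR_MIN:
        return f"{fps} FPS: inside range -> refresh follows framerate, smooth"
    # Below the window: LFC-capable setups repeat each frame enough
    # times to land the effective refresh back inside the range.
    repeats = -(-VRR_MIN // fps)  # ceiling division
    return (f"{fps} FPS: below range -> LFC repeats each frame x{repeats} "
            f"(effective {fps * repeats} Hz)")

for fps in (160, 90, 40, 20):
    print(describe(fps))
```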
240 is way past diminishing returns. Honestly, 120 is the sweet spot; 144 is just what they tend to have. You won’t see me going out to get 240Hz or whatever is on the market.
I can’t speak for you, and I will not accept someone telling me what I am capable of. I can perceive a difference between 30, 60, and 144 FPS in my own experience.
Can I see every individual frame and tell you what changes from one to the next? No, of course not; they change every 0.00694 seconds at 144FPS/Hz. But can I tell that there is a difference between what I experience at 60FPS and 144FPS? Yes, absolutely. It is far smoother visually and much more enjoyable.
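For reference, the frame times behind those numbers (trivial arithmetic, nothing assumed):

```python
# Time per frame at common rates; 1/144 s is the 0.00694 s above.
for fps in (30, 60, 75, 120, 144, 240):
    print(f"{fps:>3} FPS -> {1000 / fps:6.2f} ms per frame")
```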
What works for you is fine, and if you want to believe everything you read at face value that’s up to you, but don’t tell me I can’t see more than 30FPS, because in my experience that just sounds beyond stupid.
So when you drive and catch something in your side vision, you don’t need to turn your head? I do, and I raced real cars on a real track, which proves my reaction time is faster than most. But that focus change happens nearly instantly. The point is you refocus, which was my point: 60 degrees, then refocus to clearly acquire your target, approaching vehicle, apex, whatever.
We’re talking about a screen, but that doesn’t mean the graphics shouldn’t all be in focus, because we will move our eyes to whatever on the screen we wish to see clearly.
When I refer to resolution, I’m talking about texture detail, not the size of the pixels. At 1080p playing Far Cry 5 on ultra, I can walk up to a dead “Peggie” and see every beard hair, the brains, etc. How much more detail is needed? So then Nvidia comes out with raytracing, borrowed from the motion picture industry, which used it way back in the ’80s to make film special effects look real. Rasterization does a decent job; I don’t think raytracing is worth the performance hit or the extra $300 or whatever per card.
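For context on what raytracing actually computes, here is the textbook ray-sphere intersection test, the primitive a ray tracer evaluates once per pixel per bounce. A bare sketch, nothing like production RTX code, but it shows where the performance hit comes from:

```python
# "Does this ray hit this sphere?" -- the core question a ray tracer
# asks over and over. Toy version for illustration only.

def ray_hits_sphere(origin, direction, center, radius):
    """Solve |o + t*d - c|^2 = r^2 for t; a real root means a hit."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c  # direction assumed normalized, so a = 1
    return disc >= 0

# One ray straight ahead, one sphere in front of the camera:
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # True

# At 1080p that test runs ~2 million times per frame before you add
# bounces, shadows, or reflections -- hence the performance hit.
print(1920 * 1080, "primary rays per frame at 1080p")
```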
There are plenty who play at 4K, not 8K, as that’s still emerging. Sure, 1440p is more popular (hardly just taking off) compared to 4K, but competitive players use 1080p for the most part. The extra “eye candy” just doesn’t warrant going to a more expensive monitor, especially for over 144Hz, like a 240Hz unit. My Sceptre at 185Hz 1080p is fine.
If you can visually tell smoothness from 90-120, then the game is probably CPU bound and lagging. What you “perceive” in a game as “smoothness” you’re basing on FPS, and that’s where your brain is fooling you based on your eye’s input. Ever hear of “sleight of hand”? Works the same way, but no rabbit ever came out of an empty hat.
The eye thing is a scientific fact, not an opinion. Ask a doctor, professor, or movie producer. Every movie you ever watched, big screen to HDTV, was never over 30-40 FPS. Possibly Death Race 2000, one of the first movies Stallone starred in, from the mid-’70s, where they sped the camera up and repeated the same spin-out over and over, might exceed that.
You’re right; at 47.5 years on this planet, I learned a lot, “you know”.
What is the point? You can’t see more than 30…
Movies are filmed at 24FPS, and the reason is that film was expensive; that saved on costs while still being “fast enough” to be perceived as motion and not a series of still images.
In modern times we have kept that convention because we are used to it and expect it. Some TV shows are shot at 60FPS and suffer from what is called the “soap opera effect”, where they look “wrong” or cheap because they are not at the 30FPS of normal broadcast TV. Don’t mistake that as a contradiction: movies are typically 24FPS and TV is typically 30FPS. (EDIT: Caveat; I am in Europe, where our power grid, and thus our TVs, traditionally ran at divisions or multiples of 50Hz, which again is a holdover on modern digital TVs. The US is 60Hz and thus typically 30FPS or 60FPS for broadcast TV.)
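Those rates all fall out of the local mains frequency, plus film’s 24 (simplified; real NTSC runs at 29.97, and film-to-TV pulldown is its own rabbit hole):

```python
# Broadcast frame rates as divisions of the local power grid frequency.
# Simplified on purpose: real NTSC is 29.97 FPS, not a clean 30.
grid_hz = {"Europe/PAL": 50, "US/NTSC": 60}
for region, hz in grid_hz.items():
    print(f"{region}: TV at {hz // 2} or {hz} FPS (divisions of {hz} Hz)")
print("Film: 24 FPS everywhere, for the cost reasons above")
```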
They tried “high FPS” movies in the cinema recently with The Hobbit, releasing both a 24FPS and a 48FPS version. Moviegoers could see the difference but did not like it, as it looked “fake” or “too real”, which is an odd contradiction but what was reported. So they continue with 24FPS, as it is just what is expected.
Just leaving this here to shut everyone up.
By the way nVidia sucks. No news there.
It may sound stupid to someone who is ignorant enough to do zero research and not take what I said for what it is: a scientific fact, not some made-up baloney. Not said to anger or insult you, or to receive insults back.
Ask a movie producer, your eye doctor, a professor; the answer is the same. The visual appeal is part of the lie your brain tells you based on ocular input. It works much like sleight of hand.
You can see things faster than 30 FPS, but like you admitted, you can’t recognize every element of the frames. I wouldn’t call anyone “stupid”; doing so is “stupid”.
This is clearly going places… I will leave you to your fantasy. Good day.
STOP, NO… Games are not movies, no matter how many companies are trying to convince you of that.
You need the responsiveness, you need the fast reaction time, you need the high refresh rates. Stop it. This has reached the point where people are going down to 24 FPS and saying that is enough. No, it is not, because you are not just watching the picture. You need to react, you need smoothness; if you want immersion, you need extreme smoothness. Why does SONY mandate 90FPS on its VR games if you can do with 30? Because you can’t do with 30…
For fucks sake, stop it…
I’m sorry I have to address this.
So on the internet you can’t just keep repeating “it’s a scientific fact” without citing your sources and providing some academic backing to your statement.
Exactly. Like this 2009 LG I have: it has some built-in “lifelike” option that’s supposed to make TV look like you’re on the set. It looks bizarre with it on, as it processes frames to compensate for ones not actually in the original. Much like “fake” 5.1 Dolby Digital does to audio, though they get that right most of the time. That TV sits collecting dust.
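A naive sketch of what those motion-smoothing modes do: synthesize a frame that was never in the source. Real TVs use motion-compensated interpolation, not this plain pixel blend, but the blend shows both the idea and why the result can look off:

```python
# Fake an in-between frame by averaging two real ones. Real sets
# estimate motion vectors instead; this naive blend just shows why
# invented frames can look wrong.
import numpy as np

def midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Synthesize a frame halfway between two real ones by averaging."""
    mixed = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
    return mixed.astype(np.uint8)

# Two tiny grayscale "frames": a bright dot moving one pixel right.
a = np.zeros((4, 4), dtype=np.uint8); a[2, 1] = 255
b = np.zeros((4, 4), dtype=np.uint8); b[2, 2] = 255
print(midpoint_frame(a, b)[2])  # [  0 127 127   0] -- a ghost, not motion
```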
I did provide two sources in links above. Sure, one’s a Quora. It was a very quick search, since this “attack” on my intelligence just started. Just look up FOV and there’s an hour or two of video on how it’s calculated for games in particular.
I don’t make baseless claims like some politicians do. The “kraken” isn’t here.
I notice a DRAMATIC difference in basic things like watching WWE wrestling at 24/30 FPS versus 60 FPS.
I refuse to watch anything < 60 FPS now for Pro Wrestling.
The fluidity matters.
More fluidity obviously matters, but there are diminishing returns past 125 Hz
There are “Ideal Refresh Rates worth targeting”
But every time you increase, there are diminishing returns.
That diminishing point starts around 125Hz to 160Hz (see the quick numbers below).
At that point, you should balance other factors like Wider Aspect Ratios, HDR, Graphical Details in games, etc.
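Quick numbers on why the returns diminish: each doubling of the refresh rate shaves a smaller absolute slice off the frame time (the rate list is just illustrative):

```python
# Frame-time improvement per doubling of refresh rate.
rates = [30, 60, 120, 240]
for lo, hi in zip(rates, rates[1:]):
    saved = 1000 / lo - 1000 / hi
    print(f"{lo:>3} -> {hi:>3} Hz: frame time drops by {saved:5.2f} ms")
# 30->60 saves 16.67 ms, 60->120 saves 8.33 ms, 120->240 only 4.17 ms
```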