Framerate Wars

OK, could someone answer this for me...

People are always arguing about FPS, claiming they can tell the difference between 30-60 and 60-100. Personally I can only tell the first one. So why, when going to the cinema or watching TV at 24/30 fps, can video get away with being so low? CGI still looks silky smooth, but playing Crysis at 24 fps would make you want to gouge your eyes out with a CD-writing pen.


I know that TVs have refresh rates (Hz) that can double or triple the frame every second to make it look smoother, so why can't your GPU just output 24 fps in a game and have the monitor do the same? Just double/triple the image to make it smooth. I know that on old CRTs you could go higher, to 120 Hz, and these new 3D flatscreens can too now, if that has anything to do with it?

Thanks in advance.

Well, 23.976 fps TV video isn't actually smooth; you're just so used to it that it seems that way. As far as games go, because they're interactive, you're much more aware of lag.
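To the OP's question about the monitor doubling frames: displays already do exactly that. A 60 Hz set shows 24 fps film using 3:2 pulldown, holding each frame for three refreshes, then two, and so on. Repeating a frame adds no new motion samples, which is why it can't make a low game framerate feel smoother. A minimal sketch of the cadence (Python, purely for illustration):

```python
# Sketch: 3:2 pulldown mapping 24 fps film onto a 60 Hz display.
# Each film frame is simply held for 3 then 2 refreshes, alternating;
# no new images are created, so motion is still sampled 24 times a second.

FILM_FPS = 24
DISPLAY_HZ = 60

def pulldown_schedule(num_film_frames):
    """Return which film frame is shown on each display refresh."""
    schedule = []
    for frame in range(num_film_frames):
        repeats = 3 if frame % 2 == 0 else 2   # the 3-2-3-2 cadence
        schedule.extend([frame] * repeats)
    return schedule

print(pulldown_schedule(4))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
# one second of film fills exactly one second of refreshes:
assert len(pulldown_schedule(FILM_FPS)) == DISPLAY_HZ
```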

I heard something about the industry doubling up and going to 48 fps movies. After reading this article, apparently 120 fps movies too. I would so love it if all video went to 120 fps. I know it's extremely difficult to notice a difference above 60, if there even is one. But still, a little redundancy couldn't hurt, except maybe for file sizes.

http://www.extremetech.com/extreme/128113-why-movies-are-moving-from-24-to-48-fps

The following post is all based on my personal experience!

Is there a difference between 24 fps and 60? There is. It's not just the number of frames that pop up on your monitor, but also the smooth gaming experience, which you can only achieve with a stable fps.

Since the NV 301.42 WHQL driver, my GTX295 has started showing some strange bugs and performance drops, especially in StarCraft 2, where the driver crashes every time you open the game. It's still playable, but not as good as it once was. To the point: in SC2 you can hold your mouse over the menu button (upper left corner) and it'll show you the local time and your fps. From my previous experience (when I had NO problems with the video drivers), the fps counter in SC2 is pretty accurate.

WHEN I SEE SC2 STRUGGLING (~22 fps), THE FPS COUNTER SHOWS 40-55 FPS. And no, it's not caused by the driver crashing, and the fps counter isn't inaccurate. I've experienced this in so many games, so many times throughout my life as a gaming enthusiast: Crysis, Fallout 3/NV, Half-Life 2, Borderlands, League of Legends, and other titles I can't remember right now. I don't know if I'm in possession of some kind of super-vision, but I can clearly see a difference between 60 and 30 fps. If a smooth fps suddenly drops, I'll notice.
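For what it's worth, that "counter says 40-55 but it feels like 22" effect is consistent with bad frame pacing: the average framerate can look fine while individual frame times spike. A minimal sketch of the arithmetic (the frame times below are invented, not measurements from SC2):

```python
# Sketch: an average fps counter can hide stutter. Invented frame times,
# purely to illustrate the math.

frame_times_ms = [10, 10, 45, 10, 10, 45, 10, 10, 45, 10]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000 / max(frame_times_ms)

print(f"average:     {avg_fps:.0f} fps")   # ~49 fps -- what the counter reports
print(f"worst frame: {worst_fps:.0f} fps") # ~22 fps -- what the stutter feels like
```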

My dad upgraded his TV from a 1080p 60 Hz set to one of the newer 1080p 200 Hz ones with LED and all that stuff (those 200 Hz sets don't just repeat frames, they interpolate new in-between frames). I've spent many hours with my dad watching Formula 1 on the old 60 Hz TV, and you know, it was cool and all. But when I first watched Formula 1 on his new 200 Hz LED screen, I was totally blown away. The frames were just so damn smooth. I've never seen anything like it; not even my own 60 fps gaming experience could match it. SO YES, there is a difference between 60 and 200 Hz.

These are just a few examples of what I've experienced. But for me, there is a difference. People who say "YOU CAN ONLY SEE 24 FPS" are full of s***. Even cutscenes in games that are locked at 24-30 fps seem to struggle for me. If you put a 60 Hz, 75 Hz, 120 Hz, and 200 Hz screen next to each other playing a Blu-ray video, you can tell the difference yourself.

Just my 2 cents.

The only real way to play games is at 640x480 at 18.37 FPS.

301.42 is dildos.


I crash on the desktop watching fucking YouTube videos.

You can tell a big difference between 60 Hz and 120 Hz. Some people say this is all bullshit, and I tell you, it's not.

Go use a 120 Hz monitor for a day in a game that holds a constant 120 fps and supports a 120 Hz refresh rate, then go back to 60 Hz. You will not want to.

Television and other media of that sort compensate for the low frame rate with motion blur.
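That's the key difference: a film camera's shutter stays open for part of each frame (the classic "180-degree shutter" is half the frame time), so moving objects smear naturally across the image, while a game renders a single sharp instant per frame and strobes instead. A rough sketch of the idea (made-up motion, just for illustration):

```python
# Sketch: why 24 fps film looks smoother than a 24 fps game.
# The camera exposes each frame over the open-shutter window, smearing
# motion; the game samples one instant. Numbers are invented.

FPS = 24
FRAME_TIME = 1.0 / FPS
SHUTTER_OPEN = FRAME_TIME / 2          # 180-degree shutter: ~1/48 s exposure

def position(t):
    return 100.0 * t                   # object moving at 100 units/second

for frame in range(3):
    t0 = frame * FRAME_TIME
    game = position(t0)                              # one sharp instant
    blur_start, blur_end = game, position(t0 + SHUTTER_OPEN)
    print(f"frame {frame}: game shows a sharp object at {game:.2f}; "
          f"film shows a smear from {blur_start:.2f} to {blur_end:.2f}")
```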

... 48 fps is 24p in each eye for 3D ...

24p is the standard for cinema and probably always will be. It's got a certain look and feel, and yes, it's perfectly smooth...

As far as gaming is concerned, anything less than 60 is noticeable because the game simply runs worse at that point. It's generated differently from video: the game renders discrete, perfectly sharp frames with no motion blur and samples your input once per frame, so a low fps means both visible strobing and extra input lag.
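The input-lag half of that is easy to put numbers on. A minimal sketch under a deliberately simple model (input read at the start of a frame, shown when that frame is displayed; vsync, pipelining, and display latency all ignored):

```python
# Sketch: rough worst-case input lag at different frame rates, under the
# simplest possible model (no vsync, no render pipeline, no display delay).

for fps in (24, 30, 60, 120):
    frame_ms = 1000 / fps
    # worst case: you act right after input was sampled, so you wait out
    # the rest of the current frame plus the whole next one
    worst_ms = 2 * frame_ms
    print(f"{fps:>3} fps: {frame_ms:5.1f} ms per frame, "
          f"up to ~{worst_ms:5.1f} ms before your input shows on screen")
```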

So much misinformation in this thread.