Linus's 240Hz claims: true or BS?

Any reason not to link it? He’s got a lot of videos about FPS and monitor refresh rates; it would be nice to let everyone know exactly which one you’re talking about.

1 Like

Not sure how you can claim “Blatantly False” when these were tests with actual people in an actual in-game engine. He didn’t just state something. He had five people of varying skill there, testing everything from pure reaction time to in-game performance.

I also think you didn’t quite get the results of the video. They clearly stated:

  • Practice and skill will always SIGNIFICANTLY outweigh hardware benefits. All players stated you can play to the highest standards on 60Hz if you’re used to it and train on it.
  • Higher refresh rates had a MUCH bigger impact on casual players. If you are really good, it’s much easier to adapt to lower refresh rates.
  • The refresh rate by itself isn’t that important. They tested 300FPS on a 60Hz display and showed that the framerate alone has a significant impact (see the sketch at the end of this post). Granted, they only tested CS:GO, but a lower framerate seemed to increase overall lag, no matter the actual refresh rate.

So yes, from their testing, a higher refresh rate is a benefit in the games and scenarios they tested. It’s more relevant the less skilled you are, and spending time with your setup will always improve you more than better hardware. Also, the jump from 60Hz to 144Hz is significantly bigger than from 144Hz to 240Hz.
I’m not sure where any of this is “Blatantly False”. It’s all reasonable and close to what I expected.
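To make that framerate-vs-refresh-rate point concrete, here’s a toy latency model. It’s my own simplification, not anything from the video: I assume a frame samples input when it starts rendering, finishes one frame-time later, and the display simply shows the newest finished frame at each refresh, ignoring vsync queues, GPU load and the network entirely.

```python
# Toy model: how old is the input behind the frame on screen at each
# display refresh? Assumptions are mine, not the video's.

def avg_input_age_ms(fps: float, hz: float, refreshes: int = 10_000) -> float:
    frame_time = 1.0 / fps
    total_age = 0.0
    for j in range(1, refreshes + 1):
        t_refresh = j / hz
        # newest frame whose rendering had finished by this refresh
        # (the 1e-9 guards against float rounding at aligned rates)
        k = int(t_refresh / frame_time + 1e-9) - 1
        t_input = k * frame_time          # input was sampled at render start
        total_age += t_refresh - t_input
    return 1000.0 * total_age / refreshes

for fps in (60, 144, 300):
    print(f"{fps:3d} FPS on a 60Hz panel: input is ~{avg_input_age_ms(fps, 60):.1f} ms old")
```

Even in a model this crude, 300FPS on a 60Hz panel shows noticeably fresher input than 60FPS on the same panel, which matches what they measured.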

6 Likes

It’s worth a watch

1 Like

I was just asking @HEXiT because I would like to know where exactly he made the 240Hz claim. I’ve seen those videos and I don’t remember that.


Aside from that, like many here have already explained, the statement is of course incorrect, because the game does draw those frames.

1 Like

While concerns about tick rate are valid, I would propose instead that the FPS should be AT LEAST DOUBLE the tick rate; otherwise you are guaranteed to run into pacing/sampling issues. Whether it’s noticed by one person vs. another isn’t a concern of mine.
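A rough sanity check of that rule of thumb, as a toy model with made-up numbers (each rendered frame simply displays the newest server update it has; nothing here is from a real engine):

```python
# Count how server updates land on screen at various frame rates.
# Toy model: frame f (at time f/fps) shows the newest tick so far.

def sample_ticks(fps: float, tick: float, seconds: float = 10.0):
    frames = int(fps * seconds)
    shown = [int((f / fps) * tick) for f in range(frames)]  # newest tick per frame
    counts = {}
    for t in shown:
        counts[t] = counts.get(t, 0) + 1
    skipped = int(tick * seconds) - len(counts)             # updates never displayed
    return skipped, min(counts.values()), max(counts.values())

for fps in (30, 64, 100, 128, 240):
    skipped, lo, hi = sample_ticks(fps, tick=64)
    print(f"{fps:3d} FPS vs 64-tick: {skipped:3d} updates never shown, "
          f"the rest on screen for {lo}-{hi} frames")
```

Below the tick rate you silently drop updates; between 1x and 2x every update appears, but for an uneven 1–2 frames, which is exactly the pacing wobble I mean. From 2x up, each update is held for at least two frames.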

1 Like

No, the game doesn’t draw those frames. It draws a frame it has already rendered from previous packet data.

You saw the vids but don’t remember… LOOK AT his conclusions…

But you are wasting my time, as you’re obviously guessing and have no idea how programs are written for multiplayer.

Well, at least I have a rough idea of how animations work. :rofl:

Cya.

No mate, you’re not, because you have something called packet duplication, or packet dup, and this is why you don’t need to double the data rate.

This will re-transmit dropped packets, which means you won’t have to send the entire packet sequence again.
But that’s another thing :wink:
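For what it’s worth, here’s a minimal sketch of the selective-retransmit idea I mean. All the names and the loss rate are mine and purely illustrative, not any real game’s netcode:

```python
# Selective retransmission over an unreliable link: only the dropped
# packets get resent, never the whole sequence after the first loss.
import random

random.seed(1)
LOSS_RATE = 0.2                     # pretend 20% of packets vanish

sent = {}                           # seq -> payload, kept until delivered
received = {}

def unreliable_send(seq, payload):
    sent[seq] = payload             # buffer a copy for possible retransmit
    if random.random() > LOSS_RATE: # simulate the lossy link
        received[seq] = payload

for seq in range(10):
    unreliable_send(seq, f"update-{seq}")

missing = [s for s in sent if s not in received]
print("lost in transit:", missing)

for seq in missing:                 # resend ONLY the gaps
    received[seq] = sent[seq]

print("complete:", sorted(received) == sorted(sent))
```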

Well done, such a great example of the users in this place.
Of course I know how gfx are displayed; unlike you, I used to program, which led me to my question…
You, on the other hand, answer questions you have no actual knowledge about…

Good for you!…

Even if 240Hz doesn’t offer any real advantage in gaming, it sure does look nice for day-to-day stuff.

Personally, I think there’s a whole lot more to it than the network aspect. Your own movements mean what’s drawn on the screen is smoother, so you can pick out details more easily. It might not make the movements of enemies more detailed, but when you’re trying to snap your aim to someone’s head, that head is drawn more clearly while your screen is moving.

Finally, I’ll say that this thread is turning out to be pretty hostile on the OP’s part. I will not hesitate to lock it if that continues.

6 Likes

That’s my point in your second-to-last sentence: the jump from 60 to 120 is significant; from 120 to 240 there’s no appreciable difference.
And I contend this is purely down to a lack of fresh packet data. If the packet data existed to make every frame original, which is the ideal for maximum in-game targeting accuracy, the difference should have been more dramatic at 240. But because they topped out the packet data at 120, there was no difference…
Ergo, 240 is pointless.

Wow, it sucks for me to be dyslexic…
Any chance a game programmer can come and put me right if I’m wrong? Because all I seem to be chatting with are guys who play games.

Of all that I have no doubt; it looked nice for sure.
But that’s not my point… He’s got a sponsorship from Nvidia to do this testing, then half-arses it because he never asked a programmer how a game engine takes packet data and turns it into a frame of data.
With the end result that his conclusion, IMO, is misleading.

Linus Tech Tips is not known for hyper-analytical content. They do herpaderp fun tech stuff. That’s what they’re known for. If you don’t agree with their premise, find more credible sources.

1 Like

If that were true, there shouldn’t be any difference in the results. But there is.

I get your point, and despite not being a dev of any kind, I can understand why you think the way you do. The thing is, you are completely leaving out the human element. You can probably feel a smoother framerate above 120Hz, but you probably can’t react faster to it at that point.

I don’t think you want a programmer for this video, but a neurologist and maybe a physician.

1 Like

I say give me more Hertz until I no longer see a difference.
Now bring on 300Hz.

For something like CS there is no direct relationship between frame time and packets. The packets only update where enemy/friendly positions are; the rest is all local rendering. It would of course make a difference because of the clarity during movement, as I said earlier. The server tick rate doesn’t need to match the frame rate in this case, because the point is to make your end as clear as possible during your own movement. The advantage is of course going to vary from person to person. If you’re dogshit Silver V, then it’s probably not going to mean you get Global. If you’re a competitive player making the big bucks as an AWP guy… it might make your flicks that much better.
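To sketch what I mean by the rendering being local: the client draws at its own pace and blends enemy positions between the two most recent server snapshots. This is a toy version of snapshot interpolation under my own assumptions (the tick rate, FPS and positions are invented, and it’s not CS:GO’s actual code):

```python
# Frame rate and tick rate are decoupled: every frame gets a fresh,
# interpolated position even though updates only arrive 64 times/sec.

TICK_RATE = 64                  # server updates per second
FPS = 240                       # local render rate

# pretend snapshots: (timestamp, x position of an enemy)
snapshots = [(t / TICK_RATE, float(t)) for t in range(10)]

def interpolated_x(render_time: float) -> float:
    # find the two snapshots bracketing this frame, blend linearly
    for (t0, x0), (t1, x1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            a = (render_time - t0) / (t1 - t0)
            return x0 + a * (x1 - x0)
    return snapshots[-1][1]     # past the last snapshot: hold position

for frame in range(6):
    t = frame / FPS
    print(f"frame at {t * 1000:5.2f} ms -> enemy x = {interpolated_x(t):.3f}")
```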

As for Linus and his methodology… well…


Linus is a businessman, not really a pro gamer. As a content creator he’s really, really good at what he does. So much so that you disagree with him and are now giving him advertising. He’s getting more clicks, more views for his video, and more eyes.

He’s making fat stacks here.

I’d say at the end of the day, even if he’s wrong, he’s still winning.

6 Likes

That’s not at all what I said.
I just said that the difference between 144 and 240 is smaller than between 60 and 144.

You are assuming. They did tests. However flawed they might be, I’m personally always going with real-world data over “should, would and could”. I’m totally down to change my mind on this, but in Overwatch I can clearly tell the difference in input between the game running at 100FPS and 300FPS. Note: not refresh rate, just the frames my GPU is pushing out. I don’t care too much for packets. There is a noticeable difference in input lag.

You totally disregard the local rendering of the game. As has been stated time and time again, more fluid motion helps to predict movement patterns better, and higher FPS increases the chance of a rendered frame and a server packet lining up.

If you feel that’s all wrong and mumbo jumbo, feel free to conduct a double-blind test to prove us wrong. I’m all for experimental evidence and would gladly take part in whatever test we can come up with.

2 Likes

This is total conjecture with 0% verifiable anything on my end.

I did hear, once upon a time, that CS players will run frame rates way in excess of what they can display, because even if the data is not seen, it is still rendered and accounted for.

It was put forward as: if you move your mouse X distance while only rendering 60 FPS, the game divides that distance into in-game travel with its associated gaps from frame to frame. Now move the same X distance at 240+ FPS and you are still moving the same amount in the game, but it is divided up into many more frames, and thus there are smaller gaps between registering where the pointer is, so it will be more accurate as a result.

Not really sure what it adds to this here topic, as I am not sure if it is true, but this looks like a barrel of laughs, so I wanted to join in before it goes China syndrome and gets locked.

Edit:

This sounds like what I am saying too, though again it was a throwaway comment in a video I watched years ago… somewhere. So I cannot back any of that up.
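The back-of-envelope version of that claim, with numbers entirely of my own choosing, would be something like this:

```python
# Same flick, chopped into per-frame steps at different frame rates:
# more frames means smaller gaps between registered pointer positions.

SWEEP_DEGREES = 90.0    # one flick
SWEEP_SECONDS = 0.25    # how long the flick takes

for fps in (60, 144, 240):
    frames = int(fps * SWEEP_SECONDS)
    step = SWEEP_DEGREES / frames       # rotation applied per rendered frame
    print(f"{fps:3d} FPS: {frames:3d} steps of {step:.2f} degrees each")
```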

Yes, there’s added input lag when a GPU is loaded up. I’m not hip to the science of why it happens, but it’s definitely real and noticeable when you’re running at high frame times… beyond the frame time, even. GN talked about it a little bit in the latest Stadia video.
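My crude mental model of it, which is an assumption on my part and not GN’s explanation: when the GPU is the bottleneck, the CPU runs ahead and queues up pre-rendered frames, so your input waits in line behind them.

```python
# If N frames sit in the render queue, input submitted now is shown
# roughly N frame-times later. Numbers are illustrative only.

def queue_latency_ms(fps: float, frames_in_flight: int) -> float:
    return frames_in_flight * 1000.0 / fps

for queued in (1, 2, 3):
    print(f"{queued} queued frame(s) at 100 FPS: "
          f"~{queue_latency_ms(100, queued):.0f} ms extra input lag")
```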

1 Like

Not going to hold my breath.

Still waiting for anyone to describe what specifically was wrong with the experiment.

4 Likes