Linus's 240hz claims: true or BS?

so linus dropped a video where he claims 240hz is relevant because it brings tangible benefits to gaming.
i have to protest, because it's obvious to anyone who knows how online games are written that his claims are BLATANTLY FALSE!

all fps games use packet data to update the game during play, either as 2/4/8 KB packets or a constant bitstream of updates called ticks.
ideally, when you play online you want 1 packet per game update cycle (which includes hitbox placement and detection, along with location info for all players), and from that packet you want the game to render 1 frame.

so the ideal for players is 60 packets for 60 fps at 60hz,
or 100 packets for 100 fps at 100hz,
and so on up to 120.

after that there is no visible difference, as the packet data isn't there for the game engine to render the extra 120 fps linus is raving about…
all he's doing is displaying 2 identical frames, each for half the time of the original 1 frame.
that's it…

sure, 240 packets at 240fps at 240hz would be sweet, but that would mean your game is updating 4x as often as someone running at 60hz, or 2x as often as someone running at 120hz. that would be a massive unfair advantage…
so packet count is limited to 128 packets max (or a multiple for ticks), which sets a hard limit on original frames in any online game, and that limit is, YOU GUESSED IT…
128 fps…
sure, your computer can render more, but only 128 max of whatever fps your gpu renders will be original frames. the rest will be doubles of earlier frames.
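to make the maths explicit, here's the arithmetic behind my claim as a quick Python sketch. this assumes the strict "1 packet = 1 original frame, everything else is a duplicate" model I described above (which, to be fair, the replies below dispute):

```python
# Sketch of the claim above: under a strict 1-frame-per-packet model,
# unique ("original") frames per second can never exceed the server's
# packet/tick rate, and any extra fps is duplicated frames.
# This is an illustration of the argument, not real engine code.

def unique_frames_per_second(fps: int, tick_rate: int) -> int:
    """Unique frames are capped by whichever is lower: fps or tick rate."""
    return min(fps, tick_rate)

def duplicates_per_packet(fps: int, tick_rate: int) -> int:
    """How many identical frames each packet would produce."""
    return max(1, fps // tick_rate)

# e.g. a 128-tick server rendered at 240 fps still only yields
# 128 unique frames per second under this model.
```

so by this model, `unique_frames_per_second(240, 128)` gives 128, which is the hard cap I'm talking about.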

now if im wrong (i don't think i am, as i got all this info back in the cod 4 days, where you could see packet data adjustments on the fly and their impact in games), please feel free to prove it… if not, please SPAM LINUS! :smiley:

(sorry if this seems a bit sporadic, im dyslexic so my grammar aint the best)


I think the point is that the smoother motion makes you better able to anticipate movement, it’s nothing to do with network lag.

i.e., you get better local input that is sent over the network as a more accurate aim based on what your PC displayed.

It doesn’t matter so much what the network data rate is, as (most) modern games trust the client a lot; otherwise they’d be a laggy shit-show. this is why aimbots etc. are possible and why client checking is so important.

i.e., you move your mouse to aim, and not ALL of that data makes a round trip to the server; the smoother it is, the better your brain can process it locally. the hit/miss is based on when you click the trigger, and that’s all that gets sent to the server (well, more than that obviously, but for the purposes of aiming and taking a shot)… lining up the shot is all processed locally.
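a toy sketch of what I mean (all names here are made up for illustration, no real engine works exactly like this): aim is updated every local display frame, but only the discrete shot event goes over the wire.

```python
# Hypothetical sketch of the local-vs-network split described above:
# aiming is sampled and drawn every display frame (e.g. 240 Hz), but
# only discrete events (the shot) get queued for the server.

def client_frame(view_angle, mouse_delta, trigger_pressed, outbox):
    """One local render frame: update aim locally, queue a network
    message only when the trigger fires."""
    view_angle += mouse_delta              # tracked at full refresh rate
    if trigger_pressed:
        outbox.append(("shot", view_angle))  # only this hits the server
    return view_angle

outbox = []
angle = 0.0
# four frames of smooth mouse movement, shot fired on the last one
for delta, fired in [(0.5, False), (0.5, False), (0.5, False), (0.5, True)]:
    angle = client_frame(angle, delta, fired, outbox)
# four local aim updates happened, but only one message left the client
```

the point being: the smoothness of those four local updates is purely a display/refresh thing, independent of the network rate.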


I don’t think the difference between 120 and 240 is relevant IRL in any type of game; even if you could easily see the difference between the two, I don’t think it would help much in day-to-day gaming.

But, at least in the Source SDK, the docs on multiplayer networking (I can’t put links in my posts, but if you search for Source Engine Multiplayer Networking you’ll find the official docs) say the engine does some basic packet and input interpolation up to a point. So if the server runs at 60 ticks and the client at 240, the client will do some basic interpolation of input and packets. I don’t know if this helps you become a better gamer, but for the game experience, yes.


Yeah, it’s not interpolation i’m talking about; more the local aim tracking on the client before you actually get the network involved. e.g., the stuff you see as you swing your cursor around is at 240hz; the movement of enemies etc. may be at the network tick rate, but being able to smoothly aim at whatever at 240hz could be a benefit.

That said… i’m not a competitive online gamer (don’t care); i just can see how it could matter irrespective of what the network rate is…

The interpolation for network traffic would be a win on top of the local display helping :slight_smile:


Were the tests local or on network? I thought they were on network, but I wasn’t paying that much attention.

Also, please avoid saying things like this.


yeah, the game interpolates the packet count and bitrate to a multiple of the framerate, and renders the appropriate number of frames from each packet…

so if you’re running at 60 packets but 240fps, your game will render 4 identical frames from every packet.
from 120 packets it would render 2 identical frames per packet.
so my point is you would need 240 packets to take advantage of a 240hz screen, and that doesn’t exist because it would upset the balance of online gaming too much.

im thinking this is also why the tech tubers have been saying for years they saw no difference once they got past 120hz. yup, the packet data wasn’t there to render the frames at 1:1… so no perceivable difference.

Yeah, thin ice buddy.


I suspect the input lag from the lower power GPUs would stack with the packet delay.


wow, chill out guys. you see there’s a smiley next to it, IE NOT SERIOUSLY ASKING YOU ALL TO SPAM!..

No, at least in Source games the engine analyses the last two snapshots (ticks) and creates the next movement based on those and the FPS. They aren’t identical frames, but newer frames based on the last two ticks.
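rough sketch of what the Source docs describe (function and variable names are mine, not the engine’s): the client renders slightly in the past and lerps entity positions between the two snapshots that bracket the render time, so a 240fps client gets distinct in-between positions even from a 60-tick server.

```python
# Illustrative snapshot interpolation, roughly as the Source networking
# docs describe it: linearly interpolate an entity position between the
# two (time, position) snapshots that bracket the render time.

def interpolate(snap_a, snap_b, render_time):
    """Lerp a position between two (time, pos) snapshots."""
    t0, p0 = snap_a
    t1, p1 = snap_b
    frac = (render_time - t0) / (t1 - t0)
    return p0 + (p1 - p0) * frac

# 60-tick server: snapshots ~16.7 ms apart. A 240 fps client renders
# roughly four frames per snapshot interval, each at a different time:
snap_a = (0.0000, 10.0)   # (time in s, x position)
snap_b = (0.0167, 12.0)
frames = [interpolate(snap_a, snap_b, t) for t in (0.0, 0.0042, 0.0083, 0.0125)]
# all four positions are different -- not duplicates of one tick's data
```

so the extra frames aren’t copies of a packet; they’re new positions computed between packets.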

I know, but on a real-life basis, at least for me, I don’t think the difference between 120 and 240 would be too huge, and I don’t play competitively either, so even 120 isn’t too helpful.


This is where i think you may be confused. Just because network updates happen at, say, 60hz doesn’t mean that what you see and can aim at is updated at 60hz.

The other players will have positional updates sent over the network at 60hz (and those get interpolated anyway), but swinging your view around will be tracked at 240hz. And after interpolation on the client, the position of the other player(s) may be displayed at a higher refresh rate.

If the game client knows that player X is moving in direction Y and speed Z, it doesn’t need 60hz updates to interpolate where the guy will be in between “key frames” sent over the network.
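the “doesn’t need 60hz updates” bit is basically dead reckoning. a minimal sketch, with illustrative names (this isn’t any particular engine’s code):

```python
# Dead-reckoning sketch of the idea above: given a player's last known
# position and velocity from the network, the client extrapolates
# in-between positions without waiting for the next update.

def predict(last_pos: float, velocity: float, dt: float) -> float:
    """Extrapolate where an entity will be dt seconds after its last update."""
    return last_pos + velocity * dt

# Player X last seen at x=5.0 moving at 3.0 units/s. Positions for the
# three 240 Hz frames rendered before the next 60 Hz update arrives:
preds = [predict(5.0, 3.0, dt) for dt in (1 / 240, 2 / 240, 3 / 240)]
```

real engines blend this with correction when the next snapshot arrives, but the point stands: the in-between “key frames” are computed locally.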


Here’s the thing, though: they didn’t just think. They performed several tests and got results. If you think there was an issue with their methodology, you can and should bring that up, but I’m not sure why you would do it here when you can bring it up with them directly on the LTT forum. We cannot verify what they did or how they did it.

In any case, thought experiments and anecdotes do not yield reliable results.


there were probably [and still are] bitter, heated debates about this when 120/144hz was introduced.

I’m not sure what anyone could do about it even if Linus was completely lying. Obviously OP doesn’t think 240hz is relevant, so that’s pretty much it.


yep, their methodology was flawed. they stated they ran the game at the max allowed tickrate for all tests, which is the equivalent of 128 packets.
they got the best results at 120hz, with an almost 1:1 frame-to-packet ratio.
at 60hz they had what appeared to be input lag, but it was really the graphics desyncing from the hitbox data, since the game was updating twice before a frame could be displayed.

at 240hz they were still running at the max tickrate, and there was no apparent difference other than that it FELT smoother. the intangible “felt”…
but linus still felt he needed to say 240 is relevant when it’s just not. at least not until they increase packet count and tickrate.

I agree and I don’t care, I don’t game enough for it.

There are a lot of studies on the subject with better tests and results than the ones Linus got, and those studies leave a lot of loose ends, because there are many things that can change how fast you perceive something (brightness levels, colors, the amount of stuff on screen, concentration and more).

Their methodology didn’t prove anything tangible compared to a true study.


this is the same man who hooks up industrial air blowers to pc fan intakes, builds PCs entirely from Wish, and watercools Minecraft servers.

So not exactly Gamers Nexus.

They controlled for all of that, though. The monitors were all the same except for refresh rate. There’s only so much you can do about concentration, but they did prevent Shroud from drinking coffee partway through.

I still think there’s no real point in having this discussion here. No one who took part in the experiment is here to clarify or defend it.

If you guys want to contest the results, you should come up with a specific list of flaws in the methodology and present it to LTT on their forum.


Well, there was also a bottleneck in the tickrate of the server. CS:GO caps at 60hz.

Maybe a custom Quake server would do better. I would also like to see everybody at their peak, with a great breakfast and performance-enhancing “supplements” (stuff like G-Fuel). I’m curious about the limits of human reaction time.

I’d love to see this test run one more time on a 960hz microled gsync monitor with a 960hz server tickrate. Once those exist, they’ll be the last monitors you’ll ever need: 960 is an integer multiple of 24, 30 and 60.


Just @LinusTech

It is smoother. You can see that in the slowmo.