Linus's 240Hz claims: true or BS?

Does it speak positively about Nvidia or G-Sync? That will bring out all sorts of creative antagonizing.

Didn’t you know monitors inhibit network performance?

1 Like

stop it

The tl;dr is that if you’re an amateur gamer, you will experience the largest benefit from higher refresh rates. Pros are able to adapt to varying degrees.

And as a side note, GPU input lag is real and likely a larger factor than monitor refresh rate.

1 Like

I believe that. My gaming experience changed substantially when I got 144Hz, CS:GO- and Rainbow Six: Siege-wise.

1 Like

How do packets even relate to framerate in this case? What’s being transmitted during gameplay is game-state data, not entire frames, so even a local 100Mbit connection is enough for a local multiplayer session without any bottlenecks due to bandwidth.
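To put rough numbers on that, here is a minimal back-of-the-envelope sketch. The per-update size of ~200 bytes is an assumed figure for illustration, not a measured value for any particular game:

```python
# Rough check: game-state packets are tiny compared to a 100 Mbit LAN.
# The 200-byte snapshot size is an assumption for illustration only.

TICK_RATE_HZ = 128       # server updates per second (e.g. a 128-tick server)
SNAPSHOT_BYTES = 200     # assumed size of one state update

bits_per_second = TICK_RATE_HZ * SNAPSHOT_BYTES * 8
print(f"~{bits_per_second / 1e6:.2f} Mbit/s")                 # ~0.20 Mbit/s
print(f"{bits_per_second / 100e6:.4%} of a 100 Mbit link")    # well under 1%
```

Even if the real per-tick payload were several times larger, it would still be a rounding error next to 100 Mbit, which is the point being made here.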

I think you’re the one raving about Linus for no reason. You need to take a whole pack of chill pills dude.

I’ll give credit to him: he pulled out some numbers that can make sense and made an entertaining video.

That sounds like a really crappy engine, might as well tie movement speed to FPS…

??? I’m not even talking about what you think I am :smiley:

This is exactly what I’m referring to; you are remembering correctly.
In CS back in the day you could crank the fps up to 300 and you would get reduced input lag and a better packet-to-fps ratio on 75Hz CRT screens.
300 is a multiple of both 60 and 75, so if you had a faster monitor you could increase your fps and screen refresh together, which allowed the game to effectively update at a quicker rate.
There was a similar thing with CoD 4.
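Just as a quick sanity check of the "300 is a multiple of 60 and 75" point above, a tiny sketch; the idea, as I read it, is that an fps cap that divides evenly by both the packet rate and the refresh rate keeps the timings aligned:

```python
# 300 divides evenly by both a 60 Hz packet rate and a 75 Hz CRT refresh,
# and it is in fact the smallest cap that lines up with both.
from math import lcm

print(lcm(60, 75))          # 300 -> smallest fps cap that lines up with both
print(300 % 60, 300 % 75)   # 0 0  -> no leftover fraction of a frame either way
```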

You bump the packet rate from 60 to 100 and match the fps and screen refresh. End result: the hitbox data gets updated about 67% more often as you jump from 60 to 100 Hz/fps.

With Fortnite they tied the packet count directly to the framerate, and you could literally run at 300 packets (the max the server would allow, apparently) and gain all the benefits of a more regularly updated game.
It was such an advantage that it was nerfed almost immediately, from what I heard.

2 Likes

If you have the game Call of Duty 4,
there are a couple of console commands:
max packets
max fps
Changing the max fps value raises your allowed fps.
Max packets lets you enter a value up to 100, and the packet count is calculated between this and the max fps as a ratio… the ideal being 1:1.
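The 1:1 ratio described above is easy to picture with a small sketch. As far as I recall, the actual CoD4 console names for these settings are cl_maxpackets and com_maxfps, but treat the exact names and ranges as assumptions from memory rather than a definitive reference:

```python
# Rough model of the packets-to-frames ratio described in the post above.
# cl_maxpackets / com_maxfps are the CoD4 cvar names as I remember them;
# treat the names and allowed ranges as assumptions.

def packet_to_frame_ratio(cl_maxpackets: int, com_maxfps: int) -> float:
    """Network packets sent per rendered frame (the ideal is 1:1)."""
    return cl_maxpackets / com_maxfps

for packets, fps in [(100, 100), (100, 250), (60, 60)]:
    ratio = packet_to_frame_ratio(packets, fps)
    print(f"{packets} packets @ {fps} fps -> {ratio:.2f} packets per frame")
```

At 100 packets and 250 fps the ratio drops to 0.4, i.e. most rendered frames carry no fresh network data, which is the mismatch the post is getting at.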

There’s also Battlefield 3, where you can adjust the packet count via the .ini.
But yeah, any game where you can adjust the packet count should be able to repeat Linus’s results: a dramatic difference from 60 to 120 and zero difference from 120 to 240+.
Why such old games? They’re easier to dig up this kind of info on… and no, things haven’t changed since back in the day. :wink: So pick any game that allows you to change the packet count and test for yourselves.

1 Like

Wouldn’t it be much easier to conduct a test that doesn’t involve any networking at all? If we do what you say, we run the same test but introduce another variable, rather than simplifying things. It’d be pretty hard to say whether the “zero diff from 120-240+” is down to netcode or to the refresh rate not making a difference.

Also, if it’s only the netcode that’s holding 120Hz+ back in your opinion, wouldn’t this massively depend on the game? They test Overwatch and CS:GO in the video. From this discussion on Reddit, the two seem to have very different settings for interpolation delay. Yet we see similar results in the graphs in this test. Wouldn’t the higher delay in Overwatch mean that even at lower refresh rates there shouldn’t be a difference, while CS:GO would only show this regression at relatively higher refresh rates?
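For reference, Source-engine games like CS:GO derive their interpolation delay roughly as cl_interp_ratio / cl_updaterate; the formula and the default values in this sketch are from memory, so treat them as assumptions rather than authoritative numbers:

```python
# Rough sketch of how interpolation delay compares to the time between frames.
# The Source-style formula (interp = cl_interp_ratio / cl_updaterate) and the
# defaults below are from memory; treat them as assumptions.

def interp_delay_ms(cl_interp_ratio: float = 2.0, cl_updaterate: float = 64.0) -> float:
    return cl_interp_ratio / cl_updaterate * 1000.0

def frame_interval_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(f"CS:GO-style interp delay: ~{interp_delay_ms():.1f} ms")
for hz in (60, 120, 240):
    print(f"{hz} Hz frame interval: {frame_interval_ms(hz):.2f} ms")
# An interp delay of ~31 ms spans several frame intervals at 120-240 Hz, which
# is why it's hard to untangle netcode delay from refresh rate effects.
```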

I’m still not sure what all of this is supposed to say, though. In the end, they tested 5 people in real games and got some results. If you personally feel that it doesn’t make a difference to you, you are free to not buy higher refresh rate screens. I’m not sure, though, how this invalidates the results. The sample size might not be large enough to be representative, but the results are what they are. And you can clearly see the differences in the slow-motion capture.

1 Like

I think it would make a difference in two ways, even on a 128-tick server:
Reason 1: on a 128-tick server, if you receive a server update (tick) a microsecond after a frame was drawn, you have to wait for the next frame before you see it; on a 240Hz monitor you see that update in half the time (rough sketch below, after Reason 2).
It’s wrong to assume the frames and server updates are synced.

Reason 2: the client will feel smoother. The enemies may not update any faster, but it will feel smoother.
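Here is a small Monte-Carlo sketch of Reason 1. It is purely illustrative and assumes frames and ticks are completely unsynchronised, which is the scenario being argued:

```python
# Reason 1 sketch: a tick arrives at a random moment inside a frame interval
# and has to wait for the next frame before it can appear on screen.
import random

def avg_wait_ms(refresh_hz: float, trials: int = 100_000) -> float:
    frame = 1000.0 / refresh_hz
    # the tick lands at a uniformly random point inside a frame interval
    return sum(frame - random.uniform(0.0, frame) for _ in range(trials)) / trials

for hz in (60, 120, 240):
    print(f"{hz:>3} Hz: avg extra wait ~{avg_wait_ms(hz):.2f} ms "
          f"(worst case {1000 / hz:.2f} ms)")
# 240 Hz roughly halves this wait compared to 120 Hz, as argued above.
```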

All I get is “move on, boomer”. I like FreeSync with a 120+ limit, but I’m slow and can’t tell anymore. How about putting a story back in games, other than the same map over and over? I know, farming WoW.

Competitive gaming on PC is so micro-focused. There is nothing but finger-ligament reaction time to fail with age. So boring. I went to gaming for a story, not a sport burnout where the players get chewed up and spat out by big money.

Tell me how these top players are doing in 10 years when the tiny hot ladies and the fam are gone?

240Hz does not make your single-player game better! A narrator telling you how awesome you are as a human would.

1 Like

Can’t we have both? I’m totally fine with spending 1 or 2 hours on a competitive shooter, getting into it, trying to be the best I can, and then putting that away and going back to some Witcher 3 or such.
You don’t HAVE to do both, but I feel like gaming can be either-or. Having competitive games doesn’t exclude story-driven ones from being made. Though I agree that there is a significant lack of games with great stories recently…

No, because competitive play is map-locked and rule-set. It’s 100% boring and it is you vs other humans. If you’re good, yes, it works and you fall into the money-making machine of competitive gaming.

If you want entertainment and like playing tic-tac-toe, then have at it for 100 years.

I like games with quests, stories, mystery and sex scenes… hey ho, Witcher 1.

I wasn’t saying both in the same game. Just games with story and others for competitive play.

For you. Having a short “burst” of actions that I can repeat quickly and often to micro-improve is really fun for me. I enjoy Super Meat Boy for that reason. I play Guitar Hero for that reason. The same for Overwatch, Counter-Strike or classic Quake. It’s highly enjoyable for me to spend weeks, months or in some cases years working on perfecting a narrow skillset.
It’s also not only about whether you can make it to the global top 10 or such. It’s not about absolute skill, but about personal improvement. I’m a mid-Gold Overwatch player, so about average. Yet I have fun improving at my level without having ambitions to go pro with it.

I do too. As I tried to explain, there is space for both of those types of games. If you don’t feel like competitive play is for you, you don’t have to play those games. But just because you don’t like it doesn’t mean it’s boring to everyone else or shouldn’t exist.
I personally think the Dark Souls games are incredibly boring. I recognize, though, that many people love them and think they are among the best ever made. They are just not for me. This doesn’t mean they shouldn’t exist…

1 Like

I respect your drive to get better at a game. I spent a decade MMOG’ing before I got tired of farming.

I had all the fun you do in the process… I’m just old now :slight_smile:

1 Like

I watched the followup video he did, which involved different people, along with professional gamers. It’s a pretty detailed video, and it walks through the testing methodology they used.

The general consensus was that going from 60 to 120Hz gives the biggest improvement, but there was also some gain going from 120 to 240Hz. The gains aren’t universal. Some people will have minor gain from it and others won’t.

This assumes that the game can produce frame rates that high. I don’t think the statement was about single player/multi player/online, etc…, but instead was a general statement about gaming.

You would be correct that if the game can’t give you frame rates above 120, there should be absolutely no gain going from 120Hz to 240Hz on a monitor. However, there are plenty of single-player games that EASILY run above 120 fps with a higher-quality gaming rig.

I would recommend watching the followup video, which he produced SPECIFICALLY because many people questioned the first video. And like I said, the testing method they used for the 2nd video was IMO pretty detailed, and it produced enough data to back up the claims. IMO it’s one of the best videos Linus has made.

I also second what someone else said: refrain from suggesting that people harass someone over a video. What you choose to do is your business, but it’s not good for the community to try to get others to harass a person because you didn’t like something.

1 Like

Sorry matey, I’m old enough to make up my own mind on how I interact with the community… you’re not a mod here, and I already explained to them it was a joke, THUS THE :smiley:
SO KINDLY WIND YOUR NECK IN!

You all don’t even have a basic understanding of what packet data is and what it’s used for, or that if you can sync it with your framerate in an FPS game you will be as accurate as the game will allow.

Sure, you can run more frames, but they are always based on the packet data of a previously rendered frame; the game doesn’t interpolate.
It decodes the variables from the server, passes them into variables for the game engine, and renders a new frame. That is it… the entire cycle…

If you’re running your game at 60 fps but 128 packets, then you’re receiving roughly 2 packets per frame and 1 packet’s worth of data gets dumped. Which packet’s data is dumped depends on frame timing, so it can cause a desync between the graphics on screen and the non-rendered geometry like hitboxes, making hit detection feel off, which was shown in the video.

Then on to 120 fps with a 128 packet count (tick rate), and as I said, the game is updating as often as it can and rendering on a per-packet-per-frame basis, giving the sweet spot.

Then on to 240 frames with 128 packets: no noticeable increase in accuracy, although things felt smoother. But this is where it gets contentious, because as I said, if the game has 128 packets, the most original frames it can render is 128; the other 112 will be doubles of earlier frames.
Basically the GPU is rendering 2 frames in half the time it spends on 1 frame at 120,
and Linus claimed this gave him and everyone else a definite edge, but I fail to see how, because there was no extra packet data to render the extra frames from. If they could have upped their packet count to 240, then they would have seen a marked increase over both 60 and 120; it wouldn’t just have been a felt one, it would have been an actual one…
because more packet data means more original frames, which means a more accurately updated screen. Devs don’t do this because it would mean your 240Hz, 240fps, 240-packet gamer who can afford the best kit gets updated four times as often as a 60Hz, 60fps, 60-packet budget gamer.

So to me my point stands: they ain’t running games at 240 packets, so there’s no reason for 240Hz, because the games ain’t made to use it, at least in the way Linus is pushing it… (BTW, please remember I’m not the one trying to sell you something you don’t need when game devs ain’t developing for it.)
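A minimal sketch of the counting argument in the last few posts, assuming the simplified "one packet renders one new frame, no interpolation" model described above. Real engines interpolate and extrapolate, so this is the poster's model, not how every game behaves:

```python
# Simplified model argued above: each server packet can produce at most one
# "original" frame, so extra frames beyond the packet rate are repeats and
# extra packets beyond the frame rate are dropped.

def frame_breakdown(fps: int, packet_rate: int) -> dict:
    original = min(fps, packet_rate)  # frames carrying new server data
    return {
        "fps": fps,
        "packets": packet_rate,
        "original_frames": original,
        "repeated_frames": fps - original,           # frames re-showing old data
        "dropped_packets": packet_rate - original,   # updates never rendered
    }

for fps in (60, 120, 240):
    print(frame_breakdown(fps, 128))
# 60 fps / 128 packets  -> roughly half the updates are never drawn
# 120 fps / 128 packets -> close to 1:1, the "sweet spot" described above
# 240 fps / 128 packets -> 112 of the 240 frames repeat an earlier update
```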
