Sheesh?! Same as me.
You are so wrong you have no idea…
You need to remember that back in 2010-ish — when 144Hz first emerged — pretty much every graphics card had a DVI port, and the maximum refresh rate at which a Dual-Link DVI cable could drive a 1920x1080 monitor was about 144Hz.
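That cable-bandwidth ceiling can be sanity-checked with some back-of-the-envelope math. The 330 MHz figure (two TMDS links at 165 MHz each) is the real Dual-Link DVI limit; the ~8% blanking overhead is a rough assumption for illustration, not exact CVT-RB timing:

```python
# Dual-Link DVI: two TMDS links x 165 MHz pixel clock each.
DUAL_LINK_DVI_PIXEL_CLOCK = 330e6  # pixels per second

def max_refresh(width, height, blanking_overhead=0.08):
    # Each frame carries some non-visible blanking pixels on top of
    # the active resolution; ~8% is a rough reduced-blanking estimate.
    pixels_per_frame = width * height * (1 + blanking_overhead)
    return DUAL_LINK_DVI_PIXEL_CLOCK / pixels_per_frame

print(round(max_refresh(1920, 1080)))  # ~147 Hz: 144 Hz fits, 165 Hz doesn't
```

So 144Hz sat just under the ceiling of what the cable could physically carry at 1080p.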
The other thing to remember is that the movie Avatar got released just a year or so earlier and the 3D hype train was ramping up in earnest. Some monitor manufacturers thought that ‘the next big thing’ would be people buying 3D HD movies on Blu-ray and watching them on systems using something like Nvidia 3D Vision. Theatres screened at 24 frames per second. 3D (stereoscopic) Vision thus required an integer multiple of 48Hz. 144Hz is evenly divisible by 48Hz.
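The arithmetic is easy to verify: 24 fps film shown to two eyes needs a multiple of 48Hz, and notably 120Hz does not qualify:

```python
# Stereoscopic playback of 24 fps film needs a refresh rate that is
# an integer multiple of 48 Hz (24 fps per eye, alternating).
for rate in (60, 120, 144, 240):
    print(rate, "divisible by 48:", rate % 48 == 0)
# 144 and 240 line up evenly; 120 does not (120 / 48 = 2.5)
```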
Last, but not least, 144 was a bigger number than 120, so it got a massive thumbs-up from the marketing department because it could be successfully marketed to simpletons as superior.
So, 144Hz became a thing because a) it made maximum use of a common cable tech of the era, b) the numbers worked out nicely for consuming fad 3D movies on Blu-ray, and c) it had more marketing clout than 120Hz.
At least that’s the way I remember it…
Still doesn’t explain 240 HERTZ! That’s a shit-load of frames!
Problem with those ~4.17ms frame times: your nervous system is not able to distinguish pulse lengths below 12ms. That just appears as continuous ON, so essentially 3 in 4 frames get wasted (when taking the massive lag of the brain out of the equation).
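The frame-time figure is just the reciprocal of the refresh rate:

```python
# Frame time in milliseconds = 1000 ms / refresh rate in Hz.
for hz in (60, 120, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 240 Hz works out to ~4.17 ms per frame
```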
What I thought. Which is basically what I’m talking about. I want a monitor with the RIGHT features: time and expenditure put into the right things, not things that will just make the price go up unnecessarily. 120Hz, 1080p, 2 HDMI ports. I might connect a console. Like, PHUCK!
Depending on how big the monitor is or how close to it you are, upping the resolution to 1440p might make sense.
Multiple inputs make sense all the time.
1440p is worth it for desktop real estate alone, imo. 1080p is comparatively too cramped, after getting used to 1440p.
Depends on how far away from your screen you are (and the size of the screen). Assuming a 24" screen, at 1m (≈ 3ft) away text may become hard to read.
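Pixel density is an easy way to put numbers on that trade-off; it follows from pure geometry (diagonal and resolution), no assumptions needed beyond a flat 16:9 panel:

```python
import math

# Pixels per inch: pixel diagonal divided by physical diagonal.
def ppi(diagonal_inches, width_px, height_px):
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

print(round(ppi(24, 1920, 1080)))  # ~92 PPI at 24" 1080p
print(round(ppi(24, 2560, 1440)))  # ~122 PPI at 24" 1440p
```

Same panel size, noticeably denser pixels at 1440p, which is exactly why text stays readable at a greater distance.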
i mean hey, on the upside, at least there wasn’t a collective jump backwards in panels for laptops and desktops, changing 1920x1200 16:10 to 1920x1080 16:9: at the same width that’s 10% fewer pixels, at the same vertical inches about 5% less display area, AND a massive bottom bezel gained, because 16:9 doesn’t work for laptops at all!
and good thing also, that we didn’t move backwards in higher end resolution from 2560x1600 30 inch displays, which we had over 10 years ago now, to 2560x1440 27 inch displays, i mean that would be horrible, right?
and good thing also, that we didn’t move from 75/85 Hz to 60 Hz max for no reason when the transition from CRTs to flat panels happened…
overall point being, that the panel industry collectively shows users the middle finger, REGARDLESS! of what we actually want for over 10 years now!!
but hey at least u can pay a 400 euro/dollar gsync tax on high end monitors now…
now excuse me, i shall go back to crying over the fact that acer and asus too most likely want to charge about 1.5k euros for a 42.5 inch 4k uhd 120 hz 16:9 monitor, wondering how i am supposed to get that much money together for a display that at least isn’t a resolution or hz downgrade from displays from OVER 10 YEARS AGO!!! (1440p is, and 60 hz 4k would be compared to the CRT era)
Well, alright. Say I want a 120Hz 3440 x 1440 display that has a simple, professional design.
What are my options? I found some nice LG displays that are around the CDN$500 mark, 4K and IPS. So giving up that much resolution and adding in double the refresh rate shouldn’t be too much of a price difference, if I’m thinking about it logically. Any ideas?
Oh yeah, and usually they tend to be 34". WAY too big. That’s TV territory almost. High 20" range, 27-29. A 29" ultrawide with 120Hz and Freesync would be gorgeous. Possibly IPS? And I don’t see why it should cost over $1000. Doesn’t look like there’s anything like that. All the displays I’m finding are way too big.
are u sure u aren’t mistakenly treating diagonal inch length as screen area?
29" ultrawides would be tiny.
i mean my now quite old 24 inch 16:10 display may feel bigger, given that at 29" ultrawide u will probs feel quite squished vertically and thus work slower.
for comparison a 24" 16:10 display to a 29" ultrawide:
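The squished-vertically point can be backed up with simple geometry (assuming the 29" ultrawide is the usual 21:9 shape):

```python
import math

# Physical width/height from the diagonal and the aspect ratio:
# the diagonal of an (aspect_w x aspect_h) rectangle scales linearly.
def dimensions(diagonal, aspect_w, aspect_h):
    unit = diagonal / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

w, h = dimensions(24, 16, 10)
print(f'24" 16:10 -> {w:.1f}" x {h:.1f}"')  # ~20.3" x ~12.7"
w, h = dimensions(29, 21, 9)
print(f'29" 21:9  -> {w:.1f}" x {h:.1f}"')  # ~26.7" x ~11.4"
```

So despite the bigger diagonal, the 29" ultrawide is actually over an inch shorter vertically than the old 24" 16:10 panel.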
Sitting in front of one, can confirm. Need one mirror telescope per eye to see.
Five Glorious Inches of information real estate
I might just stick to 1440p Freesync. I don’t know about the refresh rate though since I am used to 60 Hz and 1440p 144 Hz is NOT happening on an RX 570, even 1440p 60 Hz isn’t happening on some games.
I think people need to not buy more than they need or think they need. Not that I am one to talk sometimes, since I have extra laptops that I hardly use (they cost me $150 or less anyway, not new laptops or anything). I once had a 4K 32" monitor; it was great, but it proved to be a bit overwhelming and more than I needed. I might just stick to between 1080p and 1440p, or at least try to.
Eh, regular 1080p 60Hz monitors are actually dirt cheap right now. Even some ultrawide 1080p monitors and some 1440p monitors are fairly cheap, though obviously they aren’t top-tier quality.
Note to self: Freesync only exists, and companies only spent money developing the standard to make a new feature… that’s it, nothing else.
Having a free market means everything has a price. If people are willing to pay it, then that is a price point. You can suggest people stop buying things that are too “expensive”, but some consumers have no choice but to upgrade when things break, and consider that “expensive” is a sliding scale.
Way to miss his point, though. His point was: if you’re going to keep paying more and more for, at best, incremental improvements, buying their product is going to give the corporations who are already obsessed with profit the wrong idea.
I mean, I could totally upgrade from an RX 570 to an RTX 2080 or Radeon VII; it would most certainly be a large performance gain, but is it necessary for me? Not really. (I understand that this doesn’t apply to everyone; maybe someone serious about workstation applications would want the much more powerful GPU.)
“It’s free real e-state!”
Holy shit… if I ever switch back to AMD… yeah, that’s pretty damned sweet.