Dig around in your monitor settings to see what’s there.
I suspect that it accepts 4K and downscales it. That way it can display 4K content at 1440p, which gives better detail than accepting 1080p and upscaling to 1440p.
That’s probably a good idea for video like Blu-rays, but not a good idea for games.
But as I said, this is just a suspicion and I don’t know for sure.
Scaling from one resolution to another is nothing new… For a long time there have been both DSR and VSR, and just because your monitor isn’t showing 4K doesn’t mean it can’t accept a 4K signal and squish it down to 1440p.
I don’t understand downscaling. The monitor actually has an array of 2560 x 1440 PIXELS. So what the hell does it do to display 3840 x 2160? It makes no sense to me.
It does not display 3840x2160… It still only displays 2560x1440…
For every 3 pixels of information it receives, it can only display 2 (3840:2560 is a 3:2 ratio), so it recalculates the colors and displays the result.
It interpolates the colours being sent: it now receives 3840 individual pixels of colour per line but has to fit that same range of colour into 2560 pixels, so neighbouring source pixels are mixed together and it approximates the best colour for each output pixel.
It’s recalculating colors. It’s simple… Blue plus yellow equals green, although in monitors it’s a different result, but still, it’s calculating colors. You have two blue pixels, the result will be a blue pixel. You have a blue and a white pixel, the result is a light blue pixel, and so on and so forth…
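The averaging described above can be sketched in a few lines. This is just a toy box-filter model of what a monitor’s scaler might do (real scalers use fancier filters like bilinear or Lanczos, and the function name here is made up for illustration):

```python
# Toy sketch of downscaling: each output pixel is the average of the
# source pixels it covers. Real hardware scalers use weighted filters,
# but the "recalculate the colors" idea is the same.

def downscale_row(row, out_width):
    """Average a row of (r, g, b) pixels down to out_width pixels."""
    in_width = len(row)
    out = []
    for i in range(out_width):
        # Which slice of source pixels does this output pixel cover?
        lo = i * in_width // out_width
        hi = max(lo + 1, (i + 1) * in_width // out_width)
        chunk = row[lo:hi]
        # Average each color channel over that slice.
        r = sum(p[0] for p in chunk) // len(chunk)
        g = sum(p[1] for p in chunk) // len(chunk)
        b = sum(p[2] for p in chunk) // len(chunk)
        out.append((r, g, b))
    return out

# Blue pixel next to a white pixel, squeezed into one output pixel:
row = [(0, 0, 255), (255, 255, 255)]
print(downscale_row(row, 1))  # -> [(127, 127, 255)], a light blue
```

The same loop squeezes a 3840-pixel source line into 2560 output pixels, just with more chunks of mixed neighbours.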
As to WHY you’d want to do that - by rendering at a higher resolution and then re-sampling, you can get higher-quality output than rendering at the lower res internally and displaying the result directly.
Kinda sorta like the way anti-aliasing works, but you also get additional texture detail, etc., whereas anti-aliasing just basically (somewhat intelligently, but still) “blurs” edges to make them look smoother.
Anti-aliasing throws away some colour detail; full-screen re-sampling does the same thing, but you’re dealing with a lot more data before throwing away a little in the re-sampling process.
There’s obviously a GPU processing cost, but if you have more GPU power than your screen’s resolution/refresh rate needs, maybe it’s worth it.
e.g., maybe you have a GPU capable of solid 1440p frame-rates on a 1080p display…
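A toy demonstration of why render-high-then-downsample looks smoother: render a hard diagonal edge at 2x resolution, then average each 2x2 block down (conceptually what DSR/VSR do). The function names and the fake “renderer” are made up for illustration:

```python
# Toy supersampling sketch: a native render of a hard edge only has
# 0.0 or 1.0 values, while rendering at 2x and averaging 2x2 blocks
# produces intermediate shades along the edge, i.e. a smoother result.

def render_edge(w, h):
    """Fake 'renderer': a diagonal hard edge, 1.0 on one side, 0.0 on the other."""
    return [[1.0 if x > y else 0.0 for x in range(w)] for y in range(h)]

def downsample_2x(img):
    """Average each 2x2 block of the image into one output pixel."""
    h, w = len(img), len(img[0])
    return [
        [(img[2*y][2*x] + img[2*y][2*x+1] +
          img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
         for x in range(w // 2)]
        for y in range(h // 2)
    ]

native = render_edge(4, 4)                    # hard edge: only 0.0 / 1.0
supersampled = downsample_2x(render_edge(8, 8))  # intermediate values appear
```

The supersampled version contains values like 0.25 along the edge, which is exactly the “throw away a little detail at the end, but start from a lot more data” trade-off described above.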