Yeah depends on the use case. For me (and probably many gamers out there), it is superior in every other way to any monitor I am currently looking at.
Na it was perfectly valid. I just had to bring up the $5000 waste of money that Nvidia is peddling
Except in price, performance, VRR capability, possible chroma issues, native resolution, and form factor.
Iʼm not saying itʼs a shit monitor, but itʼs far from a valid replacement for everyone like youʼre making it out to be.
Well, it is cheaper than the top-end monitors, has the same or better performance, and shouldnʼt have any chroma issues (last yearʼs model did not).
I donʼt know what you mean by native resolution, since it is 4k, but yeah, it is a big form factor for PC.
You canʼt really use 4k in gaming at 120 Hz; without VRR youʼre playing 1080p. Itʼs not really cheaper than top-end monitors, and I donʼt agree that it has the same or better performance. If youʼre spending $1800 on a monitor youʼre doing it wrong, because there are tons of great options well below half that price. Chroma issues can still pop up depending on the signal chain; thereʼs more to it than just the TV supporting it, but if you say there are no issues, Iʼll take your word for it. Thereʼs also the fact that people might not have the real estate on their desk to fit the thing.
I will agree to disagree and leave the thread.
Barnacules has 3 TVs set up for his computer. Heʼs actually what inspired me to look at using TVs as monitors, lol. I think he used OLED TVs as well.
isnʼt known for making the best decisions in the world.
None of this matters because VR is going to replace everything!
Full Disclosure: Doesnʼt own a headset, and hasnʼt tried it since those glorified tech demos in the arcades decades ago.
Seriously though looking at TVs is probably not that bad of an option if you really want that size for whatever reason.
I should have mentioned that I was more confused by your statement, because you didnʼt go into any points in detail; I wasnʼt trying to argue with you.
Well, yes, you canʼt use 4k/120 Hz that well right now. However, buying a display with only the features you can use now means you will be upgrading more regularly and probably spending more money in the long run.
For 4k, I donʼt think there are really good options in the monitor space. Everything up until the stuff coming out this year has significant compromises. There are lots of good options at 1440p, but like I said above, it is a bit of a waste to buy something obsolete just because you canʼt technically make full use of the latest tech.
What chroma issues can pop up? I assume you mean running it through an older receiver? It doesnʼt look like there should be issues on HDMI 2.0b, but I could be wrong.
Yes, but Iʼd rather wait for VR to be in a good place while I am using a 4k OLED monstrosity than on my old 120 Hz TN panel.
Have you put any thought into what youʼd need to push 4k beyond 60 FPS in any decently modern game?
Even with a 2080 Ti youʼre looking at 60-90 FPS at best in modern games. With a 2080 youʼll be hard pressed to reach 60 even at medium settings.
What rig are you planning to run to even approach 120 FPS? And what will you do when you canʼt hit 120 and donʼt have any adaptive sync? Tearing, or dropping to 60/30 frames?
Oops, for some reason I assumed it did have VRR, with the way he was comparing it to gaming monitors.
I seem to remember seeing a video where a 4k TV was fed by 4 separate GPUs, but not only was it seriously a headache to get working, it was bloody expensive to boot. Itʼs like tripping over a $100 bill to grab a penny: letʼs get a cheaper monitor (TV), but get 4 x 2080 Tis!!! Actually, they might have been using workstation GPUs… I canʼt remember what finally worked for them. Thatʼs also ignoring what type of motherboard youʼd need for 4 x 16x PCIe, and the fact that this TV probably doesnʼt have the 4 x DisplayPort connections, etc.
Iʼm pretty sure this was the LTT video and the TV was 8k. In that case you actually need multiple connections, as no interface on a GPU supports 8k60.
Or was it one of the 40" 4k monitors with 4 inputs? Those are great, but spanning across the 4 inputs can lead to them being slightly out of sync, which is worse than tearing.
In any case, SLI/Crossfire is dead, and even the highest-end GPUs available today arenʼt able to consistently deliver more than 60 FPS at 4k. So whatʼs the point? Iʼm fine with 1440p 144 Hz and will be for a considerable amount of time. As said, the next "reasonable" upgrade for me would be micro-LED. Iʼm still running a 42" LCD as my TV at 1080p. 90% of content isnʼt 4k or HDR (let alone both), and OLED has its issues. Iʼm not going to spend $1k on a new TV thatʼs still LCD just to see the same number of pixels on a larger screen, effectively reducing picture quality compared to my 8-year-old TV…
Ohhh, no, youʼre right, that was 8k. My mind is mush.
Lol, likewise. Like I said before, I hardly use the thing as is. No need to upgrade off 1080 there.
Personally, I havenʼt even upgraded from 1080 here either, as I honestly donʼt play many new triple-A titles. Once I upgrade my GPU Iʼll consider bumping to 1440p, but 4k is a long way off. I mean, can you imagine playing a DOSBox game at 4k? Upscaled 640x480? (If it would even support that resolution.)
Yeah, I was looking at a new monitor, and at 27" you can actually see the difference between 1440p and 1080p. Plus you can use 1440p at 100% scaling, which isnʼt true for 4k. And Windows scaling is trash!
Since I was upgrading anyway, 1440p and 144 Hz seemed reasonable (in terms of quality, power needed, and cost). Itʼs certainly not needed, but weʼre at a point where I wouldnʼt personally buy 1080p as a primary computer screen. If you have one, itʼs perfectly fine; I just wouldnʼt buy it new now. 4k only gets relevant at 30" and above at computer viewing distances.
Yeah, Iʼm waiting to see what AMD does with Navi before I do that. I mean, we are just starting to see the non-reference designs, and they havenʼt pushed out a flagship Navi GPU yet either.
With how old Iʼm getting, I may never get 4k for my PC. Maybe if I get real close and squint I can tell the difference.
Having a DP input would make things easier, as most GPUs have 1 HDMI and multiple DP.
I do really like 4k at 150% scaling on a 27" monitor. Everything is the same size as 1440p on a 27" monitor, but the text is much sharper. Unfortunately, many games donʼt scale in windowed fullscreen mode, which is my preference, and the only really capable 4k gaming GPU is $1200.
You would think you could just game at 1080p pixel-doubled, but unfortunately GPU drivers and monitors donʼt support integer-ratio scaling for pixel-perfect images; instead they use bilinear interpolation, which looks blurry. Integer scaling will become increasingly important to make old games look right as we move to higher resolutions. 2D games look particularly bad without it.
Edit: Actually, I just checked, and Intel (of all people) supports integer scaling in their GPUs now. AMD and Nvidia still donʼt.
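To see why bilinear looks blurry on pixel art, here's a rough NumPy sketch (not any driver's actual code; the bilinear function is a naive illustration) comparing integer-ratio (nearest-neighbor) scaling with bilinear interpolation on a sharp 2x2 checkerboard:

```python
import numpy as np

# A tiny 2x2 "image": a maximally sharp black/white checkerboard.
src = np.array([[0, 255],
                [255, 0]], dtype=np.float64)

def integer_scale(img, factor):
    """Nearest-neighbor integer-ratio scaling: every source pixel
    becomes an exact factor x factor block, so edges stay razor sharp."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def bilinear_scale(img, factor):
    """Naive bilinear scaling: each output pixel is a weighted average
    of up to four source pixels, which smears hard edges into grey."""
    h, w = img.shape
    ys = np.clip((np.arange(h * factor) + 0.5) / factor - 0.5, 0, h - 1)
    xs = np.clip((np.arange(w * factor) + 0.5) / factor - 0.5, 0, w - 1)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

nn = integer_scale(src, 2)   # contains only the original values 0 and 255
bl = bilinear_scale(src, 2)  # introduces in-between greys at the edges
```

Integer scaling keeps the exact original palette, which is why 1080p would map perfectly onto a 4k panel (a clean 2x in each dimension); bilinear invents intermediate values, which is the blur you see.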
I have a 1080ti.
I mainly am just using this particular TV as an example to show the state of monitors vs TVs, but I am considering getting one.
However, I feel the notion that you shouldnʼt get something you canʼt push (yet) is a fallacy. If you only buy monitors that you can fully use right away, you will be upgrading more often and probably spending more money in the long run.
Nothing can run this TV at 4k 120 right now, because only the new Xbox has HDMI 2.1. However, when the next round of video cards comes out, I would hope they have HDMI 2.1 and can also push closer to 100 fps.
I have been running 4k since DSR came out, back when I was using a GTX 780. At the time, newer games like ARK couldnʼt play at 4k, but ultimately DSR is better than anti-aliasing for a similar performance hit (usually). You just wonʼt hit 4k 120 often. It really isnʼt until you try to run 4k with full AA that you start really having issues.
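The HDMI 2.1 point above really comes down to raw bandwidth. Here's a back-of-the-envelope sketch (the link-rate constants are the commonly cited effective video data rates; real signaling adds blanking intervals and overhead, so treat this as an approximation):

```python
def data_rate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for uncompressed 8-bit RGB,
    ignoring blanking and protocol overhead."""
    return width * height * fps * bits_per_pixel / 1e9

rate_4k60 = data_rate_gbps(3840, 2160, 60)    # ~11.9 Gbit/s of pixel data
rate_4k120 = data_rate_gbps(3840, 2160, 120)  # ~23.9 Gbit/s of pixel data

# HDMI 2.0 carries roughly 14.4 Gbit/s of video data (18 Gbit/s line
# rate minus 8b/10b encoding overhead); HDMI 2.1 raises that to
# roughly 42.6 Gbit/s.
HDMI_2_0_DATA_GBPS = 14.4
HDMI_2_1_DATA_GBPS = 42.6

print(rate_4k60 < HDMI_2_0_DATA_GBPS)    # 4k60 8-bit RGB fits HDMI 2.0
print(rate_4k120 > HDMI_2_0_DATA_GBPS)   # 4k120 overflows HDMI 2.0
print(rate_4k120 < HDMI_2_1_DATA_GBPS)   # ...but fits HDMI 2.1
```

That overflow at 4k120 is why HDMI 2.0-era cards can't feed this TV at full refresh, and why pre-2.1 setups fall back to 4k60 or to reduced chroma.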