1440P or 4K?

I recently came into a bit of spare cash and want to treat myself to a monitor upgrade. At the moment I have an AOC 27" 1080P monitor and I think it's time for more pixels.

Right now an EVGA GTX 770 SC ACX 2GB is in my rig, but it WILL be getting replaced with either 970 SLI, a 980, a 980 Ti, or a 390X (possibly in CrossFire depending on the cost).

With that in mind, is 4k actually worth it, or should I move to 1440p? This will predominantly be for games like Elite Dangerous, Star Citizen, Dragon Age: Inquisition and World of Tanks.

These are the monitors I have been looking at - and don't forget, when you point and laugh at the prices, I am in Australia and we have very limited and expensive choices.

And I have no interest in 3D <- if that's relevant when it comes to refresh rates.

http://www.pccasegear.com/index.php?main_page=product_info&cPath=558_1213&products_id=29273

http://www.pccasegear.com/index.php?main_page=product_info&cPath=558_1213&products_id=27512

http://www.pccasegear.com/index.php?main_page=product_info&cPath=558_1213&products_id=29005

http://www.pccasegear.com/index.php?main_page=product_info&cPath=558_1213&products_id=27799

http://www.pccasegear.com/index.php?main_page=product_info&cPath=558_1213&products_id=28626

http://www.pccasegear.com/index.php?main_page=product_info&cPath=558_1213&products_id=27262

I'd grab that ASUS 4K model, but I'd recommend you get it from Amazon. It's SOO much cheaper than that link you posted.

http://www.amazon.com/gp/product/B00KJGY3TO/ref=s9_simh_gw_p147_d0_i1?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-2&pf_rd_r=1VM3YRKEZJ2N9TV1GW0V&pf_rd_t=101&pf_rd_p=1688200382&pf_rd_i=507846

Amazon will not ship to Australia.  At least for me anyway.

I had a 4k monitor but sold it and bought a 1440p monitor instead.  I think in 2 years or so I'll buy a true 4k monitor, but for now the hassle of being on the bleeding edge isn't worth it.  In a couple of years a middling GPU will have no issues driving one, and scaling issues will be taken care of.

I'm very, very happy with my 1440p monitor.  

Forget 4k, it's premature. It will be for another 5 years.

1440p is only just coming into its own; only in the last year or so has 1440p gaming been viable on a single graphics card. Basically, 1440p is where 1080p was 5 years ago. Something like that.

I guarantee in 2 years (long shot, but fuck it) it will be 21:9 3440x1440. Nobody is going to give two shits about 16:9 4k, bar productivity.

https://www.youtube.com/watch?v=Rf7FcIgkgDY

https://www.youtube.com/watch?v=p3ut4hj85Hg

If Linus can't convince you no one can.

https://www.youtube.com/watch?v=KnrxNfxRK_4

You have to keep in mind that 4k is still very much in its infancy. I would suggest not getting it for that reason alone. You'd regret buying a half-decent 4k monitor now only to see much more accessible, affordable ones pop up next year. I would, at least.


The BenQ monitor is fantastic, though personally 32 inches is a little big for 1440p. If you're going to 9xx SLI, get the AOC 1440p 21:9. I have a 1080p 21:9 (Dell U2913WM), and it's the best monitor purchase I've made. 21:9 has problems with some games, but more or less it's a fantastic experience.

If you don't want ultrawide, check out Overlord's Tempest X270OC. It's 27 inches, which is personally my favorite size for 1440p, has a good IPS panel, and can easily "overclock" to 120Hz out of the box. I don't know if you can find it where you live, but it's a fucking steal over here.

Consider me convinced!

Ok so the AOC 21:9 monitor it shall be!

Thanks for all the advice everyone!

Ooh, I read around some reviews, actually lots of reviews. Those AOC 21:9 monitors have disappeared from the NZ market completely. Kinda like everything MSI until a new GPU launches.

So I have to wonder about quality assurance and after-sales support with some products that just vanish.

I have a feeling it has to do with after-sales support, and the importers had a gutsful.

This would be my pick for a cheap 21:9. Considering the price point of the AOC, the fact that it has multiple inputs means it still carries your standard set of circuitry, which may or may not affect input latency. But the cost had to be cut somewhere.

https://www.youtube.com/watch?v=DuXA_qvkH2c

https://teksyndicate.com/videos/korean-219-montitor-crossover-290m-29-lg-ips-ultra-wide

Maybe Logan can do a follow up on it.


It must be the No Frills model of this.

https://www.youtube.com/watch?v=s0i7FjwteQE&list=PLliKax3vGO6FUKCh22UFzQV9wBRDcy-v4

Hear me out: don't buy into 2k, as 4k is coming out at the same time. It's the next leap, same as when 720p came out; 1080p came out too, it just wasn't super cheap. So buy into 4k now or wait a year or two, but don't go 2k. I am going 4k. When I bought my first 27in 1080p (non-LED) back in 2007, it was about 400-500, and it was beautiful, still is. So when you argue 2k vs 4k, think of it as 720p vs 1080p; it is the same argument, just evolved from 10 years ago. 10 years from now it will be the argument between 8k and 16k (16k supposedly being perfect at any viewing distance, according to articles I have read).

I would honestly wait for now. The 770 and 1080p should be able to hold you over decently well. Here is why I want to wait for now (being in a similar position to you).

4k is the new standard. It is going to be the new 1080p, imo. It is the next big step, and I think that 1440p is just going to get overlooked by the public in general. 4k is where TVs are going, which means that's where shows and movies are going, which means that's what gaming is mostly going to target (of course, games will still support almost any resolution). And 4k is just now getting here, which means you are going to spend a bit more than necessary. Prices are dropping pretty fast.

Adaptive sync is just around the corner. At a higher resolution like 1440p or 4k, the benefits of adaptive sync really matter, imo. It will keep you from having to push a constant 60fps for ultimate enjoyment. The first 4k monitors with adaptive sync are coming out next year. Samsung said it will release its first ones in March, but I have heard that Seiki is going to release some in early 2015 (Jan, Feb).

On top of both of those things, GPUs are still making huge improvements with every cycle. The 20nm and 16nm processes are going to start showing up in GPUs in 2015, and 3D memory is coming as well. This all spells a potentially great 4k GPU in the 390X, as well as in nVidia's next flagship (which might not come until late 2015/early 2016).

So yeah, as hard as it is, try to just hold off. Though if you really want to buy now, it is your choice; that is just my advice.

The monitors out now are true 4k; the ones that are 550ish are just going to drop to about 300. IPS will be an option, which is what you might think of as "true" 4k in the sense of truer-to-life color reproduction. However, I have a 27in TN LED (Hanns G, which replaced my first 27in from 2007 this past spring for $160) and a 27in IPS (ASUS, about $270 right now) side by side. Taking into account that the Hanns G replaced my first one, which was also a Hanns G I'd had for 7 years, all three look almost identical. The one from 2007 has a 4 or 5ms response time, though, vs 1ms on my newer Hanns G and 2ms on the ASUS. I am going to sell the Hanns G for about 100 to help pay for my next monitor: the ASUS 4k one.

Anyway, I should be able to play all my games in 4k even with a 280X. I might not be able to play Crysis or anything, even BF4 or BF3, on high, but I'll play in 4k regardless for a year with dumbed-down graphics. I also plan on replaying most of my wishlist games that I haven't gotten to yet, which are mostly pre-2009 and should be playable at 4k. Games like Dawn of War, Command and Conquer, etc.

The jump from 1080p to 4k is worth it even just for office use. I mean, a 27in 4k is better than two 27in 1080p monitors for practical desktop space: you have more room in real life and more room on the desktop, a bigger canvas for things like Photoshop. Some 4k monitors also allow input from two simultaneous sources (not referring to PiP), i.e. two computers on one screen. The applications for more pixels are practically endless.
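For what it's worth, the raw pixel math backs that up. Here's a quick back-of-the-envelope comparison (a Python sketch using only the standard resolution figures, nothing specific to my monitors):

```python
# Raw pixel counts: one 16:9 4k (UHD) panel vs two 1080p panels.
uhd = 3840 * 2160        # pixels on a single 4k screen
fhd = 1920 * 1080        # pixels on a single 1080p screen

print(f"one 4k:     {uhd:,}")       # 8,294,400
print(f"two 1080p:  {2 * fhd:,}")   # 4,147,200
print(f"ratio:      {uhd / fhd}")   # 4.0 -- four 1080p screens' worth of pixels
```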

I have a question: why do you think die size matters? I mean, shrinking the die size limits the usable space for new technologies, correct? Or am I wrong? I am just thinking that back in 2007-10 I was using a 45nm die and now I am on 32nm. Soon smaller.

I am trying to think this out. Imagine shrinking playing jacks so that more jacks can be collected: the less space they take up, the more space can be allocated for more of them. Now think of the processors on a GPU die, or CPU cores for that matter. You are shrinking the cores and simultaneously shrinking the die they sit on, correct? So instead of staying on 32nm and making 16-or-more-core CPUs, they are probably going to stay around 2-8 cores forever and just keep shrinking the die and the technology? Another issue with this is heat dissipation: shrink the internals and the surface they sit on, and heat from one component is basically right up against another component of the part.

The 20nm and 16nm that I mentioned in my comment above are not referring to the die size; they refer to the manufacturing process. A process shrink (not the physical size of the die, but the size of the transistors) increases efficiency and speed. Haswell is 22nm, current GPUs are 28nm, the next nVidia series is supposed to be 16nm, and the next AMD series is supposed to be 20nm (though with the 3D memory).

My hesitation with 4k is whether it can be driven without having to spend thousands on a multi-GPU setup. I would be happy to spend the cash on a 980 and then grab a 2nd card a month or two later on, but if that is still going to struggle, then what's the point?

That's why I have said to buy now or wait: buy now if you are willing to have dumbed-down graphics for a year or two. I am going to keep one of my 1080p 27in monitors too, for any games I have trouble with. But my question for you is: how much time do you spend playing older games? I find myself still playing games from around 2004. I mean, CSS and CSGO are still pretty much the same game, and I would guess I could play both at 4k, maybe not at the 200+ fps I get at 1080p, but somewhere between 30 and 60 (playable, especially since 99% of monitors only support 60Hz anyway). Achieving more than 60fps really only matters if (a) you want to show off the fps counter even though the monitor can't display more than 60fps, or (b) you want to show off the fps and actually have a 144Hz or higher display. From what I have heard, 4k at 120Hz+ won't be around for a while, so pushing more than 60fps is kind of pointless for a few years. I probably wouldn't even get 20fps in Crysis, but as I said, I still have a 1080p monitor for that. It's all about the extra screen space you get going from 1080p to 4k.

Aside from LOTRO I don't play old games. Also I have no interest in 3D or 144Hz - as long as I'm up around 60fps, that's all I care about.

Really this upgrade is going to be for Elite Dangerous, Star Citizen, DA: Inquisition and The Witcher 3.

The other thing I am not sure about is: if a game is too taxing to run at 4k, how would it look if I manually set it to 1440p or even 1080p? If it's not going to look weird, then that would be fine.

As pointed out by Wendell on an episode of the Tek a while back, 4k is literally a 4:1 ratio with 1080p, meaning that each pixel at 1080p becomes a 2x2 block of 4 pixels at 4k, so it should scale perfectly. Theoretically, a 1080p game on a 28" 4k monitor should look the same as a 1080p game on a 1080p 28" monitor... more or less.
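To make the 4:1 thing concrete, here's a tiny sketch of that integer scaling (Python, with made-up pixel values; just an illustration, not anything from Wendell's episode):

```python
# Why 1080p -> 4k scaling is clean: every source pixel maps to an exact
# 2x2 block, so no pixel ever gets smeared across a boundary.

def upscale_2x(frame):
    """Nearest-neighbour 2x upscale: duplicate each pixel into a 2x2 block."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in range(2)]  # double horizontally
        out.append(doubled)
        out.append(list(doubled))                       # repeat the row vertically
    return out

# Toy 2x2 "1080p" frame with hypothetical pixel values.
frame_1080p = [[1, 2],
               [3, 4]]

for row in upscale_2x(frame_1080p):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```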

Upscaling is something I've looked into, where you're playing at 2k or 1080p output from the video card. It wouldn't look much different than playing 720p on a 1080p monitor, from what I can imagine at least.