Chicken or the egg… or rather, monitor or the GPU?

Howdy… so please forgive the long-winded question…

TLDR: Trying to decide on buying a monitor now, or snagging a new GPU first…

What I have now: 3080 Ti, Dell/AW 34" IPS from about 4 years ago (CPU is a 9800X3D)

What I really want: A “5k” 34" ultrawide (but this doesn’t seem to exist).

What I do: WoW, Eve, NMS, and also office work on some days when I’m WFH (monitor will be shared between my PC and office MacBook)

All that said… I’ve been looking at the RTX 5080, and as underwhelming as that is, it still seems to be the “best” option for me, as a 5090 would be overkill, never mind the heat and power issues. But the monitor I have is capped at 100 Hz, so getting just the GPU would be silly, as I’m not gonna gain much with that cap.

So I’m looking at monitors as well, waffling between the OLED 32" 4K screens and the OLED 34" ultrawides, but those don’t really improve over my current resolution. There also doesn’t seem to be a 38" ultrawide that would gain me much either.

I’d consider IPS, but there doesn’t seem to be a good gaming IPS in the 32-34" range right now… (Acer announced one, but it’s Q3 at the earliest, and who knows what the cost would be.)

I’m also somewhat concerned about getting the 4K screen first and having my 3080 Ti push that plus my second monitor, and I can’t find reliable info on whether that would be pushing it too far.

I’d consider the 9070 XT, when and if that’s ever released (and I may not find a 5080 before that anyway), but I’m rapidly losing any confidence in AMD not screwing up that launch even more.

Do I sound confused yet?

Thoughts?

My take on the matter is display first, because you’re looking at it for way longer than you play games on it.
The GPU can wait; there’s always a compromise that can be made on game details to get better performance on higher-end displays. I’d also say that a 3080 Ti is more than capable of playing games at 4K and high details.

2 Likes

Unless your GPU is making you lower settings in multiple games, just wait. If there is a monitor you like at a price you also like, then I would be targeting that first, IMO.

I’d go monitor first. A 3080 Ti should be perfectly capable of playing games, especially the ones you listed, at 4K. Will it do it at the full 240 Hz in Cyberpunk? No, but the 3080 was essentially the top tier (the 3090 was barely faster) when 4K was going mainstream.

And if nothing else, getting past 100 Hz just on the desktop is likely going to be a very noticeable upgrade. I was slightly worried about text clarity, given how much the OLED reviews focus on it, but I’ve found it to be an absolute non-issue.

1 Like

Thanks for the feedback. Yeah, unless a 5080 drops into my lap at MSRP, it probably does make sense to wait it out and see what the fall monitors bring. There are rumors of 5K 34" screens, but those are probably 2026 screens.

I guess I’m in the minority here, but I’d say GPU first. If you really want a 5K 34" ultrawide and it doesn’t exist on the market yet, don’t buy something you didn’t really want instead. Just wait until what you actually want comes out. With the new HDMI spec just released, monitors will be coming at the end of this year and through next year that are resolution upgrades, refresh rate upgrades, or a combo of both, now that they can send more bandwidth down the display cable (rough numbers in the sketch at the end of this post).

Your current monitor may be capped at 100 Hz, so the new GPU wouldn’t benefit you with extra FPS, but it will bring your minimum frames up, which will make things smoother still. And the GPU you want is out “now” (once they come in stock, at least), while the monitor isn’t out yet. So it seems like a clear choice to me.
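
To put some rough numbers behind the bandwidth point, here’s a back-of-the-envelope Python sketch: it estimates the uncompressed data rate a hypothetical 5K2K (5120x2160) ultrawide would need at a few refresh rates, next to nominal raw link rates quoted from memory. Real links lose some of that to coding overhead, and DSC changes the math entirely, so treat it purely as a ballpark comparison.

```python
# Back-of-the-envelope: uncompressed 10-bit RGB data rate for a hypothetical
# 5K2K (5120x2160) ultrawide, with ~10% added for blanking (reduced-blanking
# timings). Link figures are nominal raw rates; usable bandwidth is lower.

def video_gbps(width, height, refresh_hz, bits_per_channel=10, blanking=1.10):
    """Approximate uncompressed RGB video data rate in Gbit/s."""
    bits_per_pixel = bits_per_channel * 3          # R, G, B - no chroma subsampling
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

LINKS_GBPS = {                                     # nominal raw link rates (approx.)
    "HDMI 2.0": 18,
    "HDMI 2.1 (FRL)": 48,
    "DP 2.1 UHBR13.5": 54,
    "HDMI 2.2 'Ultra96' (announced)": 96,
}

if __name__ == "__main__":
    for hz in (100, 120, 144, 240):
        print(f"5120x2160 @ {hz:>3} Hz, 10-bit RGB: ~{video_gbps(5120, 2160, hz):.0f} Gbit/s")
    for name, gbps in LINKS_GBPS.items():
        print(f"{name:>31}: {gbps} Gbit/s raw")
```

Roughly, 5K2K at 100-144 Hz sits right around the ceiling of HDMI 2.1 / DP 2.1 without DSC, which is why the extra headroom in the newer specs is what unlocks the next wave of resolution and refresh bumps.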

That’s fair too. Guess it’s just going to be a waiting game. The 50 series launch is such a mess I’m not going near a MicroCenter for a month… :smiley:

You won’t notice a difference at 4K; it’s pretty much only for e-peen and making your GPU run twice as slow.

Ultrawides don’t really do it for me either; I’d stick with 1440p.

but that’s just my opinion

I use decade-old 1440p Shimians that can only use dual-link DVI, which is somewhat of a pain.

If you don’t have HDR, get a decent HDR display.

I found HDR content to be a more satisfying improvement than going from, say, 1080p or 1440p to 4K.

My two cents:

GPU: From my experience, if you want to play games at 4K with raytracing, frame gen, and DLSS upscaling, an RTX 4080 or 4080 Super still works fine and might be way cheaper than a 5080 while offering similar performance. I played through Cyberpunk, Alan Wake 2, and Elden Ring with raytracing, and my flatmate is currently playing Hogwarts Legacy on the same rig at max settings (with DLSS, of course). IMO only the xx90 flagship received a big bump this generation. Our HTPC is an Intel i7-10700K on a PCIe 3.0 mainboard I salvaged from an HP Omen 25L, and it still handles any game I throw at it just fine. I have the 3080 Ti in my personal rig. It works fine for raster but struggles with the current raytracing generation of games at anything above 1440p.

Screen: Coming from a 32-inch HP Z32 IPS, I think your next upgrade would be an OLED or an HDR-capable screen in the 40-inch range. Please be aware that you need to take some preparatory measures for OLED-based displays (display sleep settings, auto-hiding the taskbar, maybe a black background - a small sketch for the sleep-timeout part below). A TV works fine too. My friend uses a 42-inch LG C4 TV as a display, and if your desk has enough depth it is perfectly fine. But be aware of possible privacy issues with TVs. LG is screenshotting your content and sending the hash data for content matching (often blown out of proportion, but still an issue IMO, and the manufacturer can change what is transmitted on a whim) - you can circumvent this by never connecting the TV to the Internet.
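
If you want to script the display-sleep part, here is a minimal Windows-only sketch using the built-in powercfg tool; the 5-minute/3-minute timeouts are just example values of mine, and taskbar auto-hide plus a dark wallpaper are quicker to set by hand in Settings.

```python
# Minimal sketch (Windows): shorten the "turn off display after" timeout so
# static desktop content doesn't sit on an OLED panel for hours. Uses the
# built-in powercfg CLI on the active power plan; the timeout values here are
# arbitrary examples - adjust to taste.
import subprocess

def set_display_sleep(minutes_on_ac: int = 5, minutes_on_battery: int = 3) -> None:
    """Set the display sleep timeout (in minutes) for plugged-in and battery power."""
    subprocess.run(["powercfg", "/change", "monitor-timeout-ac", str(minutes_on_ac)], check=True)
    subprocess.run(["powercfg", "/change", "monitor-timeout-dc", str(minutes_on_battery)], check=True)

if __name__ == "__main__":
    set_display_sleep(5, 3)
```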

I never liked ultrawide screens. You either get the lower half of a big TV or one and a half screens. With common formats like 16:10 or 16:9, you simply don’t have to fight with games or applications that don’t support ultrawide resolutions.

To supplement @Dratatoo’s warning about OLED migration prep (from the perspective of a current LG 42C2 OLED owner):

OLED panels are absolute and unassailable winners in the areas of motion clarity, response time, and real-world contrast, but they are not strictly superior in all areas of performance.

Some of the drawbacks are non-obvious, as they were never present in normal panels.

  • ABL, as in the Automatic Brightness Limiter - an OLED display cannot sustain maximum or even common brightness levels over a large area indefinitely. This is a non-starter for some and requires unpleasant workflow adjustments
    • human eyes are very sensitive to even small brightness changes, so your panel dimming as you open and resize an MS Word window is jarring as f***
    • ergo, dark mode everywhere, all the time, by necessity
    • ABL will fuck with you even in SDR mode at low brightness (as in 20%)
  • Extreme pixel response and low persistence times mean that flicker-free operation is much harder to achieve.
  • Non-standard or non-uniform subpixel layout, especially if using an OLED TV
    • might lead to text blurriness or color fringing, since ClearType-like technologies are not designed to handle these kinds of subpixel layouts and likely never will be
  • QD-OLED’s multi-layer approach hurts effective contrast ratio in environments with ambient lighting.
  • Limited pixel lifespan and uniformity impact if driven too hard for too long
    • can be mitigated by SDR mode and low manual brightness
    • will likely be mitigated by newer, more efficient luminophore chemistries and tandem panels
  • HDR mode is nearly useless and actively unpleasant for desktop use
    • close to zero productivity apps and OSes are mastered for HDR use
    • high brightness is eye-searingly unpleasant
  • If using a TV, you will have to deal with HDMI shenanigans and learn to hate them with a passion
    • you will also learn to hate YUV with sub-4:4:4 chroma subsampling (it’s really lovely to get grey instead of black thanks to color compression, yay…) - rough numbers in the sketch below this list
    • you will learn there is no integrated GPU capable of outputting a 4K 60+ Hz RGB signal over HDMI
    • forget about getting 4K RGB at 120 Hz with dedicated AMD GPUs on Linux as well, thanks to HDMI licensing restrictions
    • DP 1.4 → HDMI 2.1 active converters work, but are not reliable
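
Rough numbers on the chroma subsampling point, as promised above: a small Python sketch comparing average bits per pixel for 4:4:4, 4:2:2, and 4:2:0, and the resulting raw data rate for a 4K 120 Hz 10-bit signal. It ignores blanking and link coding overhead, so it is only a relative comparison, but it shows why a bandwidth-starved HDMI link drops chroma resolution first - and that halved chroma resolution is exactly what smears colored text and UI edges.

```python
# Why sources fall back to chroma subsampling: average bits per pixel for the
# common pixel formats, and the resulting raw video rate at 4K 120 Hz 10-bit.
# Blanking and link coding overhead are ignored - relative comparison only.

FORMATS = {
    # name: (luma samples per pixel, average chroma samples per pixel)
    "RGB / YCbCr 4:4:4": (1.0, 2.0),   # full-resolution chroma
    "YCbCr 4:2:2":       (1.0, 1.0),   # chroma halved horizontally
    "YCbCr 4:2:0":       (1.0, 0.5),   # chroma halved both ways
}

def raw_gbps(width, height, refresh_hz, bits=10, luma=1.0, chroma=2.0):
    """Raw (pre-blanking) video data rate in Gbit/s for the given sampling."""
    bits_per_pixel = bits * (luma + chroma)
    return width * height * refresh_hz * bits_per_pixel / 1e9

if __name__ == "__main__":
    for name, (luma, chroma) in FORMATS.items():
        rate = raw_gbps(3840, 2160, 120, 10, luma, chroma)
        print(f"{name:>18}: {10 * (luma + chroma):>4.0f} bits/px -> ~{rate:.1f} Gbit/s at 4K 120 Hz")
```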

I bought OLED for the high contrast and motion clarity. The extreme aggressiveness of ABL was surprising, even in an optimized setup with SDR on and manually lowered brightness.

HDR mode up this close suffers even worse from ABL, and it’s hard enough on my eyes at this distance that it doesn’t matter anyway.

ABL, however, is panel specific, and its impact will be reduced over time as OLED panel tech improves.
We have new luminophore chemistries in the production pipeline, as well as stacked tandem OLED panels.
These panels will likely be able to operate without ABL on SDR content at lower brightness levels, making them ideal for normal desktop work.

The upcoming LG C5 series might have tandem panels; no idea about the luminophores. Until then there is no reason for me to upgrade.

1 Like

So regarding Ultrawides…

The games I’m playing (WoW, EVE, NMS) all work fine with it, and for productivity I like being able to spread stuff out across the screen - a browser, an editor, and such all visible at once. I do notice the lack of height sometimes, but I know if I go to a 32" I’ll miss the width as well.

I think I’m at the point where I’ll bite on the first compelling display that checks enough boxes. That 32" 5K Acer might do it, but that’s probably not until December. Not sure if there’s anything coming in the 34" range besides more 240 Hz refreshes of the same 3440x1440 screens.

Since I use the display for productivity/office work as well, I’m loath to get anything larger than 34" that’s still 3440x1440, because of text clarity.

I suppose I should look again at 38" screens, but I’m probably waiting for model refreshes there as well.

Yeah, all those OLED concerns have me a bit worried… I’m WFH generally one full day a week plus other scattered hours, so I have lots of text, desktop stuff, etc. in the same place for multiple hours when I’m doing that.

Yeah, a TV is a non-starter, as I’m doing WFH on it as well, for… a gov-affiliated company.

I’ve done some measuring, and a 38" is really the biggest I could do. I think my eyes would have issues going larger as well.

It sounds like we’re in a similar setup: I have two 34" AW 3440 x 1440 displays (AW3418DW and AW3420DW) with an RTX 3080 and 9800X3D.

I really enjoy 21:9 aspect ratio for gaming and productivity tasks, and the 34" size is large but not obscene for my desk. I’m happy with this, and as you said I don’t believe you’ll find many qualitative upgrades if you want to stick with this size and form factor.

A possible contender would be this, but it lacks OLED and is a reduction in refresh rate over my current displays: https://www.newegg.com/lg-34bk95u-w-34/p/0JC-000D-006A8

The price is also roughly double what the latest 34" AW displays have been going on sale for. To me there are too many compromises for too much money to justify upgrading, so I’m saving my pennies for a GPU upgrade in the next 12-18 months, once inventory levels and pricing settle.

Yep, AW3418DW represent! :smiley: (I have a vertical 27" Dell next to it for Discord/YouTube, etc.)

There are rumors of 5K versions of the 34"s, but those probably won’t even be announced until CES next year, and then only shipping well into 2026.

It’s a tough space, because the market seems to have gone to 32s or to massive ultrawides.

Yeah, I don’t see 21:9 or wider becoming “mass market”, so in general, look at the cutting edge of the 16:9/16:10 market to see where 21:9 may end up in the next year or so. I would go back to 16:9/16:10 for the right display, but I’m in no rush.

1 Like

I had to go the TV route due to it being cheaper and not wanting an ultrawide. 4K 16:9 at 42" is optimal for me, and there are next to no OLED monitors with those parameters.

I don’t know if I’m willing to buy a TV model again, but since newer LG monitors will come with webOS crapware too, I don’t think it matters anymore.

A TV’s ABL is theoretically controllable with a service remote, if you don’t mind losing the warranty and the potential negative lifespan impact of doing so.

Might be worth doing on my unit, since the warranty is gone and I don’t use HDR or high brightness anyway.

This is just my opinion, but rewarding Nvidia with my purchase of a 5080 seems unthinkable to me. That product was implicitly designed to under-perform and under-deliver so the company can maintain its dominance in the market. Giving you as little performance per dollar as possible ensures that the gains they need to deliver with the next generation are as small as they can be. In short, Nvidia is sandbagging their GPUs.

Meanwhile, in the monitor market we have significantly more competition and variety of consumer choice. There are some fantastic models to try out if you haven’t upgraded in the last few years. You’re getting a lot more for your money by upgrading your monitor instead of your GPU.

Personally, I suggest getting a high-refresh-rate gaming monitor. Have you played at 240+ Hz? It’s a game changer.

2 Likes

I get that… Nvidia is dragging their feet on updates, for sure. But I blame AMD there in large part for A) not being able to compete and B) effing up their launch so badly that they look worse than Intel.

But at the same time, what’s the other option? Buy a 4090 for $1,800+? It’s unfortunate, but the market is what it is, and it’s going to stay that way for another two years.

If the right monitor were available now, that would be an easy choice, but the larger IPS screens I’d prefer either aren’t available or are very long in the tooth. Hopefully that Acer monitor turns out to be the answer if a compelling 34" doesn’t show up late this year.