GPU Upgrade? Or not? Or something more? + CPU/Intel Discussion

Technological progress: manufacturing a CPU with a given level of performance gets cheaper over time.

As an extreme example, the CPU in my Apple Pencil is both a lot faster and a lot cheaper than the whole Amiga 500 system I had in 1992:

  • 32 MHz vs. 7 MHz
  • multiplies etc. in 1 clock cycle vs. several cycles on the Amiga's 68k
  • speculative execution and so on, which the 68k doesn't have
  • a full 32-bit design vs. the 68k's 32/16-bit one
1 Like

Let me chime in as well. For 1440p a GPU upgrade is a good idea, especially since you can't properly scale down to 1080p. As others have mentioned, I'd go with a 5700 (XT), probably Gigabyte's.
While raytracing is awesome and all, it seems to me that you need at least a 2080 (Ti) to really enjoy the feature, so AMD should be fine.
Picking a card that is supposed to last an entire console generation is a stretch. Games will be heavily optimised for the consoles, while the GPUs in use today will be almost forgotten by the end of the new consoles' cycle.

CPU-wise I think you should be fine. That being said, your frametimes will be better with one of the new Ryzen CPUs; decide for yourself which one you want. From an averages perspective, there shouldn't be much of a difference at 1440p.

Concerning Intel: I doubt they will do anything vastly different in the next generations. Intel’s solution will be to throw money at the problem and buy market share through that. They can afford it :wink:

2 Likes

Wait, what? What do you mean you can't properly scale down to 1080p? I'm just saying, if a game is too demanding to run at 1440p on my new monitor, I figured going down to 1080p might be a good idea. It's not like the aspect ratio is out of whack; it's still 16:9. What's wrong with that? I think a GPU like a 2070 Super would last an entire generation of consoles, no? If a GPU has 50% or so more power than the consoles, it's pretty solid. Playing on my 960 has overall been very nice, at least at almost 1080p (1050p?). If, let's say, we know the 2070 Super has 100% more performance than the consoles, that speaks for itself. Sounds like a winner to me.

And on a random note… Holy SHIT. I’m still amazed by how Skyrim Special Edition’s GPU requirements are vastly different from the original version. A GTX 780. Holy shit…

And my 960 is over 3x as fast as a GTX 260, which was the original Skyrim's GPU requirement.

Well, if you still play a lot of CPU-demanding titles, then I would still consider sticking with Intel.

The aspect ratio is fine, but that's not the issue. Scaling 2160p down to 1080p works because 2160/1080 = 2: each rendered pixel maps onto a 2x2 block of four physical pixels, which works without any issues. But 1440/1080 = 4/3, so each rendered pixel would have to cover 1.33 physical pixels, and you cannot light a third (or half) of a pixel. On a 1440p panel you'd have to drop to 720p instead.

When I purchased my new monitor, which has a resolution of 5120x1440, it arrived before my new GPU did. My old card couldn't handle the resolution, so I picked 3840x1080. The result was just weird and I wasn't able to use it for longer than 1.5 h; afterwards I had a slight headache, even though I normally don't get them often. However, if the game itself offers a resolution-scaling option, like DA:I or the latest two AC, you can use that without any issues because it handles things differently. You can also try the effect yourself by setting 720p on a 1080p monitor: after a while you will notice that something is just off.
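If you want to sanity-check that arithmetic yourself, here's a tiny Python sketch (my own illustration, not from any driver or tool; the function name is made up):

```python
# Does a render resolution map onto the panel with a whole number of
# physical pixels per rendered pixel? (illustrative sketch only)

def per_axis_scale(panel_rows: int, render_rows: int) -> float:
    """Physical pixels each rendered pixel has to cover along one axis."""
    return panel_rows / render_rows

for panel_rows, render_rows in [(2160, 1080), (1440, 1080), (1440, 720)]:
    factor = per_axis_scale(panel_rows, render_rows)
    verdict = "clean integer scaling" if factor.is_integer() else "fractional, has to be interpolated"
    print(f"{render_rows}p on a {panel_rows}p panel: {factor:.2f}x per axis ({verdict})")
```

It prints 2.00x for 1080p on a 2160p panel, 1.33x for 1080p on a 1440p panel, and 2.00x for 720p on a 1440p panel, which is exactly the point above.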

As mentioned above, that's hard to say. The new consoles will get firmware and driver updates over their entire lifecycle, whereas PC driver updates mainly focus on the newest GPU generation. Sure, an older generation still gets new features and sometimes a performance bump, but the main focus is on the newest cards, and game companies develop for the newer products too. Raw performance does not always win, either: I think the Radeon VII has significantly more raw compute than a 1080 (maybe even a 1080 Ti, I'm not sure), but it does not outperform the 1080 in games the way that raw number suggests. Secondly, the new consoles will use some kind of raytracing functionality from AMD, and that is probably where the real difference will be. I doubt the 2070S will be able to handle raytracing properly in modern titles, but the consoles will. So you could either not care about RT and go with a 5700 (XT), or pick a more expensive RTX card.

I disagree. AMD offers better bang for the buck (3600(X), 3700X) and there is no reason to pick Intel across almost the entire product stack. However, Intel still offers the very best gaming CPU, the 9900K(S). But considering he is not thinking about purchasing a 2080 Ti, I won't recommend a 9900K(S); it still makes more sense to put as much money as possible into the GPU.
Also, I think politics matters as well. In both the CPU and GPU markets I'd almost always pick AMD over the other for competition's sake. Intel and Nvidia abuse their power and are not very customer friendly; without AMD we were stuck on 4 cores at the same prices for generations. The only reason to pick a non-AMD product is when there is no competition from AMD in that product segment, or when the price/performance ratio is (way?) better. (This is why I picked an Nvidia card for my last GPU upgrade, even though I prefer AMD as a company.)

Well, I'm not gonna lie, I can't say I understand what you're saying about the resolution. Are you simply saying 1080p is just not pixel-dense enough for 27"?

And I think you're overthinking the consoles a bit. I had a PS3, and I had a PC with an RV770 4850, and the PC destroyed the PS3 in every game I played on both. According to UserBenchmark the 4850 is 745% faster than a 7800 GTX (didn't expect that), which is what the PS3 used. Even if we take console optimization into account, and assume the Cell was handling some of the graphics workload, that's a massive gap and not nearly enough to compensate. Not even close. Everyone knows consoles get more out of their hardware because of low-level drivers, but like I said, even my 960 has been a far better experience than what a PS4 offers, and that's only about 60% more performance.

And going back to my new monitor, you're saying I'm basically left with no choice but to upgrade if I want to play at 1440p, otherwise all games will look like dog shit? I wasn't expecting that. Some people say it's bad, others say it's fine, and others say it depends on how far you sit from your screen? :thinking:

My statement has nothing to do with pixel density. As far as I know, a monitor's pixel is usually made up of RGB colour crystals (honestly, I'm not entirely sure, and I assume it also depends on the panel and the technology used). So a pixel unit consists of three crystals representing the different colours, and your upper limit in terms of pixel density is bound by the number of these crystals (*). Those three crystals together form one square pixel. By combining several pixel units you can form a larger square, but you cannot divide them into smaller parts (otherwise you could set a 1080p monitor to 2160p by splitting every pixel into four). Hence, to fit properly they have to be grouped in packs of 1 pixel unit (the native resolution) or a square number of them (4, 9, 16, and so on). So a 1440p monitor has 2560 columns of 1440 pixel units each; as mentioned above, there is no way to end up exactly at 1080p, but 720p works fine.

*) Bear in mind that it might actually be more crystals, for redundancy purposes or things like that. I haven't read up on those crystals.

Red Dead benchmarks are out and I may need a GPU upgrade to run it with the pretty stuff turned on…

I'm thinking a 5700 XT paired with either a 1080p/144 Hz or a 1440p/60 Hz monitor would be the cost-effective way to play. No way I'm buying a 2080 Ti on my budget…

I'd definitely get a 1440p display. Believe me, you won't care about 144 vs. 60 fps… but man… those pixels :slight_smile:

1 Like

I think what @Azulath was trying to say is that integer scaling is a thing. You can scale down to 1080p on a 4k (2160p) panel because exactly four 4k pixels can be made to look exactly like one 1080p pixel (a 1080p pixel on a 4k display is simply 2 pixels high and 2 pixels wide).

If you're trying to run 1080p on a 1440p panel, the maths doesn't work out to integer scaling. To display a 1080p pixel on a 1440p panel you'd need pixels roughly 1.33x the size of a 1440p pixel in each dimension, which clearly you can't do. So this has to be fudged, either by the GPU (which essentially does sub-pixel sampling and approximates the output so it "looks" like 1080p) or by a scaler board in the monitor (which does the same thing, probably less competently). Either way you pay with additional GPU work and/or a blurry picture on your 1440p panel, plus all sorts of weird artefacts on moving objects.

To get "clean" integer scaling for a lower resolution on a 1440p panel you'd need to drop to 720p (or 360p). Which is kinda chunky in 2019, but it might still look better than 1080p on a 1440p display.
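To make the "what fits cleanly" point concrete, a small sketch (the mode list and helper are my own, purely for illustration):

```python
# List the common 16:9 modes that divide a panel's resolution evenly in
# both dimensions, i.e. the ones that allow clean integer scaling.

COMMON_MODES = [(3840, 2160), (2560, 1440), (1920, 1080), (1280, 720), (640, 360)]

def clean_fits(panel_w: int, panel_h: int):
    return [(w, h) for (w, h) in COMMON_MODES
            if w < panel_w and panel_w % w == 0 and panel_h % h == 0]

for panel in [(2560, 1440), (3840, 2160)]:
    print(panel, "->", clean_fits(*panel))
```

On a 2560x1440 panel that leaves 1280x720 and 640x360; on a 3840x2160 panel you also get 1920x1080, which is why 1080p on 4k works out.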

All that said, 1440p is a good intermediate step between 1080p and 4k. It's significantly sharper than 1080p, but takes a lot less GPU power to drive than 4k. To get super smooth frame rates on ultra settings in every game at 4k you're still looking at a LOT of money, and even a 2080 Ti can't manage it in all modern titles.

1440p is much easier to achieve.
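A very rough way to see why, as a back-of-the-envelope sketch: raw pixel counts per frame (a crude proxy that ignores settings, bandwidth and everything else that actually decides frame times):

```python
# Pixels pushed per frame, relative to 1080p (rough back-of-the-envelope only).

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP per frame, {pixels / base:.2f}x 1080p")
```

1440p is roughly 1.78x the pixels of 1080p, while 4k is a full 4x, so the jump to 4k is much harder on the GPU.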

For me though, I'd personally rather go for a 4k panel and run 1080p for the stuff my card can't push at 4k, so I have 4k for the desktop, photos, high-res movie content, etc. On a 4k panel, 1080p scales "cleanly" thanks to the integer scaling.

For a lot of stuff, IMHO frame rate wins. I’d rather have 1080p 90 fps than say 1440p 60 fps.

1 Like

Yeah, but then there's the problem of the ridiculous amount of GPU power needed to run a game at 4k, and that's just for 60 fps. I still have yet to see 1080p on a 1440p display; I've heard multiple people say 1080p looks great at 27", so I'll just have to wait and see. And the monitor market is so irritating right now, I can't see myself getting a 1080p display. Even if I wanted one, I don't think one exists with all the things I'd want and without making any compromises: 1080p, IPS, 120 Hz, fast, and it would probably cost $400. Regardless of specs the prices are disproportionately high. Still too high, still irritating, still unreasonable, and it still pisses me the fuck off. It's 2020 (almost), not fucking 2005.

I might be wrong, but with a budget of $400 you should be able to get a monitor with your required specs in 1440p, let alone 1080p. I haven't checked the prices, but it seemed that way when I bought my new one. (That being said, my price target was different.)

I think as long as you haven't experienced something like 1440p or 4k, you will find 1080p at 27" perfectly fine. But once you've tried the higher resolution for some time, you will not want to go back, believe me. Therein also lies an advantage, though: it means you will still enjoy a 1080p 27" monitor if you pick one up.

In terms of GPU power, 1440p should be fine on mid-range cards ($400).
