GPU Advice?

I think that’s where AMD want to go, but I do think low-end discrete graphics is important. If I find an old Ivy Bridge Extreme Edition for cheap on eBay, it would be nice to be able to turn it into a budget 1080p 60 machine, and at some point that iGPU is likely to feel its age before the attached CPU does. Spending money on a high-refresh 1440p card seems wasteful for someone who spends a bit of free time on their workstation gaming at 1080p on a 24" display. I would like both to be viable options.

That said, what they said about having four outputs from the iGPU on the motherboard was neat. I only use one monitor for gaming, so buying a GPU just to get a display on the other three would be stupid.

Yes, and no. Look at the bottom of the barrel this gen, the RX 6400: only 4 GB, no video acceleration, performs like shite on a PCIe 3.0 system, and still costs $159+. If AMD went lower here we’d be talking APU territory.

To be fair, the RX 6400 could have been decent if they had included an x8 → x4 MUX, as well as hardware decoders on the card. As it stands though, the cheapest you want to go right now is $299 for a 6600. The low end is just not worth it at the moment.

The 3050 is not a low-end card either btw, the fake $250 MSRP ($329+ in real world prices) makes it too pricey for that. Especially when you can get a 6600 for $299, and a 6600 XT for $369.

Low end discrete GPUs will be wiped out by either DDR5 or DDR6 and APUs.

High end discrete GPUs (at least for consumer) will be wiped out for the mainstream a few years later.

For the low end we’re getting to the point where system memory bandwidth is fast enough, and as Apple has demonstrated, using system memory avoids a bunch of copying back and forth over the bus, which is expensive both electrically and in processing.

Apple’s the first one to really show that with the M1 Pro/Max, but you can be sure that as higher bandwidth system memory becomes more commonplace the others will follow, especially for the low end.

Hell, even for the high end, accessing half a terabyte of system memory for large workloads is eventually going to be a lot more competitive than trying to do it over a PCIe bus to a discrete GPU.

Nvidia put out a video a few years back that said (paraphrasing) that bandwidth is everything and that the processing was effectively “free” in comparison to the bandwidth requirement for GPUs. If you can cut bandwidth by not copying everywhere, you can get away with slower memory in certain situations (those where the data is larger than your expensive VRAM), because you’re not wasting some percentage of your bandwidth on just shuffling data around.
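To put some very rough numbers on that (the bandwidth figures below are my own ballpark assumptions, not anything Nvidia or Apple have published), here’s a back-of-the-envelope sketch of streaming a working set over PCIe versus just reading it out of unified memory:

```python
# Rough sketch of the "copying is expensive" point above.
# All bandwidth figures are assumptions/approximations, not measurements.

PCIE4_X16_GBPS = 32     # ~theoretical PCIe 4.0 x16 throughput, GB/s
UNIFIED_MEM_GBPS = 400  # ~M1 Max-class unified memory bandwidth, GB/s

working_set_gb = 24     # hypothetical dataset larger than an 8-16 GB VRAM pool

# Discrete card: anything that doesn't fit in VRAM has to be streamed over PCIe.
pcie_stream_s = working_set_gb / PCIE4_X16_GBPS

# Unified memory: the GPU reads the same buffers the CPU wrote, no bus copy.
unified_read_s = working_set_gb / UNIFIED_MEM_GBPS

print(f"Streaming {working_set_gb} GB over PCIe 4.0 x16: ~{pcie_stream_s:.2f} s")
print(f"Reading it from unified memory:        ~{unified_read_s:.2f} s")
```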


You’re talking about cards that in a sane market would be $150. These are 750, 750 Ti type things being sold as if they’re GTX 1060s. You’re right that they are bottom-of-the-barrel shit; I take major issue with the fact that over the last couple of gens they have not been priced as such. Turing’s price gouging followed by a chip shortage has emboldened Nvidia and AMD to sell us barely-better-than-integrated graphics for the cost of an RX 480.

So in a couple of years, when your 7600 XT can no longer produce playable framerates at 1080p despite the CPU sitting at 15% load, you just… chuck it away and buy a new CPU, possibly with a new motherboard and RAM depending on how long you’ve left it? Not everyone is going to need a card capable of 1440p 144 Hz, but people content with 1080p 60 are still going to have to upgrade at some point.

Well, to be fair, the real problem here is that GDDR6 is waaaay expensive and GDDR5 is no longer being produced. GDDR6 is something like $20 per GB to source, so that alone is $80 for the RX 6400. Add the cost of the chip, all the electronic components, board, assembly and so on, and $150 becomes an extremely tight margin - and two RX 6400s use the equivalent RDNA2 silicon of one RX 6500 XT, a $199 card. It would not surprise me if $140 is the break-even point for the RX 6400 and $170 for the RX 6500 XT. Remember, MSRP is what it costs to make the card + manufacturer’s cut + retailer’s cut.

For the 3050, the GDDR6 alone is $160, and the break-even point is somewhere around $275 from what I’ve heard. Nvidia set an artificially low MSRP to make the RX 6500 XT look worse than it already is, and it’s kinda coming back to bite them now that the 6600 and 6600 XT are going back to their MSRPs.
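For anyone who wants to sanity-check the arithmetic (the $/GB figure and the break-even numbers are the estimates quoted above, not official BOM data), a quick sketch:

```python
# Quick sanity check on the BOM figures quoted above. The $/GB number and the
# break-even points are the post's estimates / my assumptions, not vendor data.

GDDR6_USD_PER_GB = 20

def memory_cost(vram_gb):
    return vram_gb * GDDR6_USD_PER_GB

print("RX 6400 (4 GB) memory cost: $", memory_cost(4))   # -> 80
print("RTX 3050 (8 GB) memory cost: $", memory_cost(8))  # -> 160

# If ~$140 really is break-even for the RX 6400, the chip, board, assembly and
# everyone's margins would have to squeeze into the remaining ~$60:
print("Room left under a $140 break-even: $", 140 - memory_cost(4))
```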

So while I agree prices are not what they should be, it’s not just scalping here.

Those numbers were at 1440p, so at 1080p it should be significantly higher. Interesting that you mentioned FFXIV, as I actually started playing it this week. Unfortunately it cannot maintain 60 fps at 1440p, but that is with Proton. I am sure at 1080p it would be a much better experience.

I’m not really considering the upgrade path. Sooner or later the system will need full replacement anyway, and at that time many won’t need a discrete GPU.

Discrete cards will still exist, but I suspect they’ll be relegated to enterprise stuff. But maybe not even there, because, as above, moving data around is expensive.

It may take a few generations, but it will happen eventually. I’m old enough to remember math co-processors :smiley:

When was the last time you saw one of those? :slight_smile:

This looks like my GTX 970, is priced like a 970 brand new, and likely performs like one. The only difference is it doesn’t even have NVENC capabilities. I am looking at the 6400 as a bad reincarnation of the 970, and mine isn’t even dead yet.

I was hoping to pay roughly the same price today, expecting the same generational improvement in the equivalent product line over time, but no. Crypto boom + COVID + silicon shortage + potential WW3 + recession + inflation completely FUBARed it.

I’ve almost given up on AAA gaming. These days, it’s the indies making the good gameplay innovation. The rest is just playing catch-up to that innovation and literally just offering MUCH WOW GRAPHICS! And nothing more. The “Fee to Play” model (basically paid games with free-to-play monetization) isn’t making games any better these days.

From Software not fixing PC online multiplayer for DS/DS2/DS3 for the sake of having an active Elden Ring playerbase is not making it easy for me to like them (or buy Elden Ring, for that matter). I promised to finish DS3 with an online component before I buy Elden Ring. But it looks like things won’t be fixed, and therefore I won’t be buying Elden Ring. No new games, no need for new GPUs, at least for me.

That is so unnecessarily wasteful, though. Someone buying a 7800X for gaming on release day will get multiple GPUs’ worth of lifetime out of that CPU, because that’s just what happens. My i7-3960X, a six-core from 2011, saw three GPUs used in its lifetime. It would have seen four if my use case had stuck to gaming, maybe even more. Six Sandy Bridge cores are still that competent a decade on; I would expect any six-or-more-core CPU on a modern architecture to similarly last a long time.

I doubt this will become the norm outside of the extremely low end. I don’t think anyone wants to make a CPU pushing 1000W for both the CPU and GPU combined, and for Nvidia that would require supplanting x86 with ARM altogether for the home gaming market. That may happen eventually, but that’s a long term achievement at best.

I feel like that’s not really our problem to solve, and pricing the mainstream market out of PC gaming altogether is not a solution. Think about it from Nvidia’s point of view: you’re saying that their solution to the ~$200 market is “buy a new CPU, motherboard and RAM from AMD”. Not only is that way more expensive all in all, not only is it extremely wasteful when the CPU component still has years’ worth of use in it, but it involves Nvidia directing consumers to their main competitor. All for the sake of memory that is far too performant to offer any benefit on this tier of GPU.


Oh, for sure. AMD is still abandoning the low end though, as is evident with the 6400 and 6500 XT. They just cannot build anything cheaper.

Let’s see what the rumored GTX 1630 brings to the table here; a Turing GPU in 2022 feels pretty meh, but if the rumors of GTX 950 performance for $149 are true it might work.

Turns out it’s moot: most of the 7000 series only has two CUs. That’s not enough to do anything meaningful with as far as gaming goes.

To add to that, it seems like the 1630 is a thing and a 6400 XT, 6500 or 6550 XT is on the way, so maybe the low end isn’t screwed over completely after all:

I love it when my pessimism turns out wrong!

Yay!

idk how you can be disappointed by the 1630. My expectations are already extremely low lmfao. Honestly, at this point, at the risk of repeating the GT 1030 incident, I’d be tempted to see what you can achieve by putting DDR4 on something like the 6500 if the issue is price. How limited by VRAM bandwidth can a card of such a low tier possibly be?
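For a rough sense of scale (the memory configurations below are hypothetical, just picked to compare orders of magnitude), peak bandwidth is simply transfer rate times bus width:

```python
# Ballpark answer to "how bandwidth-limited would a DDR4 version be?"
# Peak bandwidth = effective transfer rate (MT/s) * bus width (bytes).
# These configurations are illustrative assumptions, not product specs I've verified.

def bandwidth_gbps(mt_per_s, bus_bits):
    return mt_per_s * (bus_bits / 8) / 1000  # GB/s

print("6500 XT-style GDDR6, 18000 MT/s @ 64-bit:", bandwidth_gbps(18000, 64), "GB/s")  # ~144
print("Hypothetical DDR4-3200 @ 64-bit:         ", bandwidth_gbps(3200, 64), "GB/s")   # ~25.6
print("GT 1030 DDR4-style, 2100 MT/s @ 64-bit:  ", bandwidth_gbps(2100, 64), "GB/s")   # ~16.8
```

So a DDR4 variant would be giving up the better part of an order of magnitude in bandwidth, which is roughly the gap that made the DDR4 GT 1030 such a mess.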

I was thinking the 1650 was going to be the future 1030 as things evolved over the last few years. Supply and demand have been so out of sync and askew that a 1630 might just be something in the pipeline three years late.

No humble brag, but I can’t believe the physical inventory of GPUs right now - MC.

From brand new 1050 Tis and 2060s to 40-some-odd different SKUs at $300-500 now. The demand missed the inventory, and now new shit’s coming through in months. AFU!

I never expected APU-like capabilities. If it’s enough for basic display needs, I’m OK with it. I’ve also seen statements that it will support up to four displays, and without the compromises that come with current APUs. If all SKUs have this, then the motherboard HDMI/DP outputs always work.
Excellent for people building a server or who “just want a desktop”.


I think we’re talking about a couple of different things here.

I’m not talking about immediate upgrade paths, I’m talking about where the industry is headed in say 5-10 years.

If you buy a new system in say 5 years time, chances are there will be integrated GPUs that are capable enough for 90% of users to actually game (properly) with.

In 10 years time i do not expect discrete GPUs to be a thing at all outside of ultra high end/scientific workloads, etc.

You won’t see 1000 watt CPUs; you’re going to see CPU/GPUs in the next few generations being fast enough for what most people want inside of 100 watts - or more likely, 15-30 watts.

The ultra high end PCMR discrete GPU thing is going to become more and more niche.

You only need to look at what the consoles are doing in 300 watts with two-year-old APU hardware today, and what the Quest 2 and iPad, etc. are doing in 10-15 watts already, to see where things are headed.

It’s taken a long time, but like everything else, GPU will end up becoming integrated into the CPU core.

And eventually, CPU and main memory will merge too. But that’s probably at least 1-2 decades out just yet.

The conversation was about it replacing 1080p discrete cards, and hence the lack of sub-$200 cards.

No we’re not talking about different things, I’m telling you the implications of what you’re saying.

If you buy an APU that is good enough for current games at 1080p, then two years down the line you will be updating your whole system. This is especially so for 1440p, 4K and high-refresh gamers, who will also be locked into an entire platform. The upgrade needed just to change the graphics card when everything else has years of useful life left is just obnoxiously wasteful. It’s not going to happen: sustainability is a bigger engineering buzzword than it has ever been. You are also envisioning a world where all of this gaming is being done on ARM, because Nvidia are not going anywhere and they are not getting an x86 licence.

The idea that all gaming is going to take 30 watts is a pipe dream. The consoles don’t even manage that. You’re expecting the entire games industry to just stop pushing forward, stop pushing fidelity and graphics and physics, and to settle for keeping the same performance as we have now at lower and lower power levels. Nothing has given me any indication that this is going to happen. You’re also kind of implying that 1440p high-refresh users and 4K users are just going to be given the finger as they are pushed to a PS4 experience, just as the current-gen consoles are moving to more regular 4K and high-refresh gaming.

I really do not think your prediction is likely, and it would be the final death of PC gaming if it came true.

I’m pretty sure we’ll get dedicated SKUs with a more sophisticated iGPU implementation later on. I’ve seen some related news and rumors around that in the last few weeks. They all state 16 compute units; that’s definitely more than the small I/O-die iGPU, and it could be competitive with the worst discrete GPUs out there.

Then we’re back to e-waste and how expensive it is for people on something like an RX 460 to have to buy a new CPU, motherboard and RAM for the sake of a £150 graphics card upgrade. These people are only after 60 FPS; any Intel CPU released in the last decade and any AMD CPU released since 2017 will be good enough for that.

It pushes the cost of a low-end upgrade up to like $500, which gets you comfortably into the mid-range discrete market. There need to be reasonable £150 cards like the GTX 960 and 380 again.

Regarding e-waste, I would argue it is a lot more wasteful to replace an entire GPU card than a single CPU chip every three years.

If AM5 lasts for five gens, then in theory you could do a CPU swap for gens 1, 3 and 5, and that is all the e-waste that PC would produce over six years.

Of course, in theory. Will that hold in the real world?
