My last GPU upgrade was in 2019 when I got the Radeon 5700XT, and I think I’m ready for an upgrade to a 9070(XT) this year, or if the price is too high initially (closer to €600 than €500), maybe next year.
All of the cards should feature “optimized compute units”, new ray tracing units (and maybe more of them per CU?), “supercharged AI compute”, and a new display engine for better media encode/decode, all built on the TSMC N4 node. They will also be the first AMD cards to feature machine-learning-based upscaling, via FSR 4.
The top end of the lineup seems to be targeting 7900XT/4070Ti-level performance at about 7800XT-level power consumption and price (based on rumors), while the 9060 series targets 7600XT/7700XT/4060Ti-level performance.
My hope is that the 9070 series ends up about twice as fast as my 5700XT (which it looks like it will) and can do decent ray tracing. With 16GB of VRAM it should be a nice upgrade from the 5700XT, if the price remains within reason.
I will update this post as new information is revealed.
Looks like an interesting upgrade; however, I just watched the NVidia CES presentation and they straight up murdered Radeon in a way I’ve never seen one company annihilate another, pretty much in the history of ever. I mean execution style.
At a minimum, I’d look at reviews and benchmarks to see where the price/performance between the 5070 and the 9070 XT lands, to make sure you are getting the full bang for your buck. Hopefully, after NVidia’s announcement, the 9070 XT lands closer to $399. If not, it will probably sit on shelves until it does.
I think an important thing to keep in mind is that the graphs they showed were using DLSS Performance mode. So unless you are shooting for those high FPS numbers, I don’t know if those charts are really a good way to compare what is actually going on.
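To put rough numbers on that (a back-of-envelope sketch, assuming the usual DLSS Performance scaling of half the output resolution per axis; the resolutions here are illustrative, not taken from the slides):

```python
# DLSS Performance mode renders at half the output resolution per axis,
# i.e. a quarter of the pixels, then upscales to the target resolution.
out_w, out_h = 3840, 2160                 # 4K output target (example)
scale = 0.5                               # Performance-mode per-axis render scale
render_w, render_h = int(out_w * scale), int(out_h * scale)
pixel_ratio = (render_w * render_h) / (out_w * out_h)
print(render_w, render_h, pixel_ratio)    # 1920 1080 0.25 -> really rendering 1080p
```

So a “4K” bar on those charts can reflect a 1080p rendering workload, which is why it says little about native performance.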
That 5070 at $549 is real fighting talk. Whatever system builders actually pay, and wherever retail prices land, it looks like nVidia will price aggressively.
I like the Linux amdgpu driver so I’ll wait and see how much RX 9000 I can afford. Most of my screens are capped at 60Hz, too, so I’ll have to rethink what’s meaningful for gaming at 1440p/2160p but capped at 60FPS.
All I saw from nvidia was aiaiaiaiaiai. It’s not very promising to me that the only comparison they are willing to make is AI TOPS numbers, which mean absolutely nothing to me. So far the only gains they have shown over the previous gen in gaming are with multi frame generation, which means that 3 out of 4 frames are AI-generated garbage, and that is of no interest to me whatsoever.
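For a rough sense of the arithmetic behind that complaint (a sketch, assuming 4x multi frame generation as marketed, i.e. three generated frames per rendered one; the 240 FPS figure is just an example, not a quoted benchmark):

```python
# With 4x multi frame generation, only 1 in 4 displayed frames is
# actually rendered by the game engine; the rest are AI-generated.
displayed_fps = 240      # hypothetical headline number from a marketing slide
mfg_factor = 4           # 1 rendered frame + 3 generated frames
rendered_fps = displayed_fps / mfg_factor
print(rendered_fps)      # 60.0 -> input latency tracks this, not the 240
```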
I do hope that AMD will price the 9070XT at around $500 max now though, since they will probably have the weaker card, at least once all the AI nonsense is taken into account.
I was disappointed that they didn’t show RDNA 4 at CES, but it makes sense if they only had 45 minutes to talk about everything they wanted to. RDNA 4 probably deserves 45 minutes on its own.
A 70-class card at $550 being considered “fighting talk” is just woefully depressing. The last time I bought nvidia, the 70 class sat where the 80 class sits now, and cost in the $300s. Mining scalpers pushed it to just over $400.
Everyone picks favorable titles for comparisons. If you make decisions based on their marketing, you’re doing it wrong, and everyone should know this.
But no, AMD isn’t “just as bad”. They don’t claim 4X the actual performance of their GPU based on fake frames while disguising that fact as much as they think they can legally get away with. They’re not trying to sell you the idea that their cards just got 3x cheaper for the same performance.
AMD’s sin this time around, IMO, is not even trying in the high end (unless they’re seriously sandbagging, I guess). If there were a decent uplift from the 7900XTX, I might have done a single-gen upgrade for the first time in my life, but I will NOT be giving nvidia one or two thousand dollars.
this.
super weird mentality, but a lot of people, even though they are buying a midrange card, will not consider AMD because NVIDIA makes the fastest high-end card. A card that they are not buying.
i have said this before, but humans are weird, i do not understand them.
I know there have been some videos highlighting some of the AIB cards, but we haven’t seen any versions with a water block, right? That gives me some hope that we will be seeing a higher-end model, with some versions that have a block. The counterargument is that the three 8-pin power connectors on some of the cards mean they will be at the higher end of the power profile regardless.
Supposedly they are letting AIBs use a lot more power if they choose: up to 315W, IIRC, from the stock ~260W TBP. This is all rumor at this point, but it makes sense given that a lot of the OC models seem to have three 8-pin connectors, and some even have the new 12-pin.
I think for Nvidia we will have to wait almost a week past launch day to see objective raster performance reviews. All the launch-day reviews will be based on free review samples sent to the main outlets, and Nvidia will be strongly pressuring those outlets to only show performance with RT and DLSS. A raster-vs-raster comparison would show the cards in a very bad light, and Nvidia will not want that.