I'm attempting to do an AMD build that is more or less comparable to the rig I have listed on my profile to use for gaming and potentially streaming. I've already bought my CPU (FX 9590), motherboard (ASUS Crosshair V Formula Z), case (Fractal Design Arc XL), and CPU cooler (Corsair H100i). I'm thinking about doing 290Xs in crossfire, but I'm not necessarily sold on that as I'm fairly uneducated on the AMD side. Do you guys have any advice on that or any other parts I might need? Any help would be greatly appreciated.
I know crossfire support has improved over the past year but I would check very carefully if the games you play benefit from it. It's a lot of money wasted if that extra 290X is unused 90% of the time and the cash may be better spent on a monitor or input devices etc.
Sounds like an awesome build though :-)
Thank you. I thought about that with Crossfire. My main reason for wanting to do it is that I have SLI GTX 780s in my current setup and I want to be able to show games on an equal or close to equal setup. Most of the comparisons I've looked at show the FX 9590 being pretty comparable to my i7 2600K and the 290X looks to be the closest to my GTX 780s.
Here's everything you need to know.
- CrossFireX has been shown to have fewer issues as of late than SLI (as demonstrated by Ryan Shrout of PCPer), but scaling isn't ideal in most games. Some scale pretty well, but the only game in existence with next-to-perfect scaling is Tomb Raider. If you had 4 R9 290X cards in your system and played Tomb Raider, you WILL get the performance of ALL 4 cards, which is ULTRA rare for PC games. Most games don't even scale well at 3 cards, and with 4 cards you'd basically have a 4th card in your rig that just stands there and looks pretty; it really won't contribute to performance.
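To put rough numbers on that, here's a back-of-the-envelope sketch of multi-GPU scaling. The per-card efficiency figures are illustrative assumptions, not benchmark results:

```python
# Rough multi-GPU scaling estimate: each extra card contributes only
# a fraction of a full card's performance. The efficiency numbers
# below are illustrative assumptions, not measured values.
def effective_fps(single_card_fps, extra_card_efficiencies):
    """Estimated FPS for 1 + len(extra_card_efficiencies) cards."""
    return single_card_fps * (1 + sum(extra_card_efficiencies))

# Hypothetical well-scaling title: 2nd card adds 90%, 3rd 60%, 4th 10%
print(effective_fps(60, [0.9, 0.6, 0.1]))   # roughly 156 FPS from 4 cards
# Near-perfect scaling (the Tomb Raider case) would approach 60 * 4 = 240
```

The point is the diminishing returns: under these assumed efficiencies, the 4th card adds almost nothing.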
- AMD's current FX line is ancient, so the R9 290X will be bottlenecked, but it will be fairly minor; you won't really notice it all that much unless you stare at FPS counters all day.
- If you ever decide you don't like water-cooling your CPU, you can use a Noctua NH-D14 or NH-D15 to cool it. I believe Pistol uses one in her rig, and it's kept the PC pretty stable.
- Streaming benefits from the extra cores that the FX Chips have, so you've made a good purchase.
The AMD FX line may be ancient, but the 290X will not be bottlenecked by the 9590, since the 9590 is actually a better-performing CPU than the 2600K in most multi-threaded applications. It being 2015, games have FINALLY begun to use more than one or two cores of a CPU, so we have finally arrived at 2008. The ancient 8-core FX line has a long life ahead of it, and being ancient doesn't mean it is not a really good CPU. Bottleneck my a**... And the 290X is more powerful than the 780 (non-Ti), so even one 290X will perform better.
Allow me to educate you,
Yes, an R9 290X is faster than a 780 in MOST benchmarks.
The FX-9590 is a factory-overclocked 8350. If you were to actually look around or even ask a few people, you'd find that past around 4.6GHz the "8350" stops showing much of an improvement in performance unless you are doing rendering or video editing. In gaming, performance really doesn't improve beyond around 4.6GHz.
Second of all, to quote you, "Bottleneck my ass": most test benches tend to use a Socket 2011 based CPU to "remove ALL potential bottlenecks," correct? Well, the 8350 was released in October of 2012. If you TRULY believe a CPU from 3 years ago can still hold its own in today's game benchmarks, then maybe AMD would have considered using it to test Catalyst Omega. But sorry to break your heart, they didn't. AMD's Catalyst Omega was tested BY AMD THEMSELVES using a 4960X to show "the tangible improvements" of their graphics cards with the new driver. Most of the improvements AMD showed on their charts were really exaggerated, but there were some minor improvements, and they were ALL tested by AMD using a 4960X.
And my response to your comments,
- My comment about the FX line being ancient is just factual. It was released in 2012, and we are in 2015; AMD should have replaced the Piledriver FX chips YEARS ago.
- I NEVER said they were bad CPUs; you may want to re-read my comment.
- The current FX line having a long life ahead? HELL NO. If it did, AMD wouldn't need to replace the current 32nm Piledriver architecture with the alleged 14nm Zen architecture, now would they?
- Games leveraging more cores: yes, it's getting there, but it's still not here. Most games are leveraging at MOST 4 cores.
Refer to the footnotes in these links.
I can attest to the fact that 4.6GHz seems to be the sweet spot (I have built and tested four machines around the 8350, so I have a spectrum to judge by); it's marginal gain after that versus the tradeoff of massive heat/cooling issues.
I am pretty sure that once devs get to know the current-gen consoles better, PC games will benefit. Not quite there yet though, and we have to go through the pain of things like Assassin's Creed Unity, Dead Rising 3, and Dying Light (although Dying Light is nowhere near broken, just some performance issues which seem easily fixable :D).
That aside though, I am pretty happy with my current AMD machine. I'd never built an AMD rig before this one (was always Intel/Nvidia), and I decided to go for a change and slum it (had more pressing things to spend my cash on, like a mortgage :P).
.. in some alternate reality somewhere there is a me typing this on his intel / nvidia rig :D
BTW, Crossfire is awesome when it works... sadly it seems 50/50 whenever I buy a game.
Agreed with Kat.
Plus, by the time games are actually able to use 8 cores, the 8-core Piledriver will be so slow that it'll still perform way worse than the future offerings.
In short. Yes, it'll perform better once games use all those "cores." Will it bring home a kick-ass performance crown? I think that answer is pretty clear already...
There is no more life left in the piledriver architecture for future enthusiasts.
Thanks for the input guys. I really appreciate it. As far as the 290x goes, which ones are the ones that I should look at or avoid?
Here are the R9 290Xs you can get ahold of, in order from best to decent.
- MSI 290X Lightning
- Sapphire R9-290X Vapor-X (This comes with a backplate)
- Gigabyte Windforce R9-290X
- XFX Double Dissipation R9-290X
- Powercolor PCS+ R9-290X (this comes with a backplate)
These are, in my personal opinion, the go-to cards, at least until the R9 300 series arrives soon.
In my opinion #1 and #2 should be switched. I'd also put the Sapphire Tri-X up there probably between #2 and #3.
What about the 295x? The XFX model is on Newegg for pretty cheap (in comparison) right now.
Two R9 290X cards would be cheaper and less of a hassle. You need to have a bit of knowledge about your PSU if you go with an R9 295X2, because it breaks PCIe power specifications. I don't recommend you get it if you don't really know the technical details of your PSU.
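To put a number on the spec issue: the PCIe spec rates the slot for 75 W and each 8-pin plug for 150 W, while the 295X2 has only two 8-pin inputs against a board power of roughly 500 W (that figure is approximate). A quick sketch of the arithmetic:

```python
# Why the R9 295X2 breaks the PCIe power spec: the slot is rated for
# 75 W and each 8-pin connector for 150 W, but the card draws around
# 500 W (approximate board power) through just two 8-pin inputs.
SLOT_W = 75
EIGHT_PIN_W = 150
spec_budget = SLOT_W + 2 * EIGHT_PIN_W   # total power allowed in-spec
board_power = 500                        # approx. R9 295X2 board power

print(spec_budget)                       # 375
print(board_power - spec_budget)         # 125 W drawn beyond the spec
```

That extra draw comes through the 8-pin cables, which is why the PSU's per-rail/per-cable ratings matter for this card.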
Just go with 2 standalone R9 290X cards; it's less of a hassle. Or if you TRULY want a "dual GPU" very badly without having to deal with the PCIe spec issue, PowerColor has a "Devil 13" R9-290X (it's labeled as an R9-290X, but it's two R9 290X GPUs on one PCB). Word of advice though: the card is fucking MASSIVE and heavy as shit. Oh, and it comes with a free mouse =)
That's crazy as hell. I'll keep that in mind, but I'm kind of leaning towards the MSI Lightning. It seems like the cheapest option and has a $30 rebate on top of being $70 off on Newegg right now.
Well, I wouldn't put the R9 295X2 too far down there. With two R9 290Xs, you usually have issues with thermal throttling due to the heat output of one card going into the other card. The R9 295X2, with its all-in-one cooler, would alleviate these issues.
His build is ATX; he has enough space between the PCIe slots for the cards to breathe.
1. I agreed it is ancient...
2. I never said you did.
3. Ancient, and they are still good CPUs. Soon games will use more than 4 cores, and people with 8 cores will still be able to use their ancient CPUs. That defines long life for me...
4. I agree, welcome to 2008... Still, FX is a good choice for the price, and there are cores left over for other stuff...
As for Socket 2011: gaming-wise, 2011-v3 CPUs show a 1-3 FPS advantage over the 8350... check the benchmarks again...
You're missing the point.
YOU said there is no bottleneck; I'm telling you there is, end of discussion. I never said the CPUs are bad, they are excellent CPUs for the money, BUT they will bottleneck a 290X, which is fact. If there truly wasn't a bottleneck like you say, then AMD would have tested their drivers with the 9590, but they didn't; they went with a 4960X to remove all potential bottlenecks.
There is a reason there aren't any AMD-based test benches on tech YouTubers' channels, and it's not because they hate AMD. AMD hasn't released a CPU that destroyed Intel in performance in years.
AMD Piledrivers really come into their own and strut their stuff on the Linux platform. That is where they do their best work and really show the power of AMD. Windows is so heavily Intel-biased that Microsoft had to write patches to make Bulldozer mostly work right. So, when building a Linux rig, even for gaming in Linux, I recommend an AMD chip.
The 'Bulldozer' patches were for Windows 7. Why?
1) Before Bulldozer, Windows had only been optimized for Hyper-Threading, and the scheduler would try to avoid putting work onto the second logical processor if the first was busy. Windows 7 did not recognize the AMD FX as Hyper-Threaded and so would load up both cores of a module as though they had no shared resources (cache etc.).
2) The core parking function in Windows 7 was a little aggressive under the default power profiles, and continually parking and un-parking cores wastes time while saving little power.
Windows 8.1 has never had these problems but (in my opinion) still suffers from sub-optimal scheduling on the i7 and FX-8xxx, which can be resolved by manually assigning logical processors to a process or using a tool like Process Lasso.
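For what it's worth, that kind of manual pinning doesn't need a third-party tool on Linux; the standard library exposes the scheduler affinity call directly. A minimal sketch (Linux-only, and CPU 0 is just an example choice):

```python
# Minimal sketch of pinning the current process to a specific logical
# processor. Linux-only: os.sched_setaffinity is not available on
# Windows, where Process Lasso or `start /affinity` is the equivalent.
import os

original = os.sched_getaffinity(0)   # 0 means "the calling process"
os.sched_setaffinity(0, {0})         # pin to logical CPU 0 only
assert os.sched_getaffinity(0) == {0}
os.sched_setaffinity(0, original)    # restore the original mask
```

On an FX-8xxx the interesting trick is pinning to one core per module (e.g. CPUs 0, 2, 4, 6) so threads don't share module resources, though the exact numbering depends on the system.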
The Linux kernel schedulers are, in my opinion, better, but don't expect miracles: they won't give an FX-8350 better single-thread performance than any Intel chip since Sandy Bridge, sadly.
As far as gaming is concerned, Linux is still as hamstrung as Windows: as soon as you are running a graphically complex game, the GPU and not the CPU is the bottleneck. All Nvidia and AMD drivers are essentially single-threaded as far as moving data to the card is concerned. OpenGL may offer multi-threaded support, but all the threads have to recombine to get to the display driver, so efficient usage of the FX-8350 or an i7 by Linux gets wasted in a sense.
I've done a fair amount of testing with X-Plane X on openSUSE 13.2 and Win 8.1 now and get around 20% more FPS in Linux when graphics settings are low and the workload is on the CPU. I then lose the benefit once the graphics card is really put to work, to the point that Linux is only 1-2 FPS faster. I can still see from the timings captured that the CPU is sending data to the card more efficiently, but the return diminishes the harder the card works.
I would imagine that if Bohemia were to release ARMA 3 on Linux you would see similar results.
Fascinating stuff though :-)