Server build critique

Hey all,

I'd appreciate any constructive criticism on this build for my home lab.

Long story short, my boss won't let me take any equipment home, and as much as I can play with things at work, I can't get as in-depth as I'd like. Plus, any self-respecting IT guy should have his own playground at home anyway.

I plan to add another E5 and more memory in the near future. My thought process is that I can limp along with the higher-capacity DIMMs and one CPU now, then bring in the rest of the parts later and have a pretty powerful system when it's all said and done.

- Hyperthreading and dual CPUs are a must for me
- IPMI is also a must (see the quick remote-management sketch after this list)
- I chose the board because I will be experimenting with GaaS/vSphere/Docker/Proxmox/blah blah blah
- I know the board fits in the R5, and I will transfer it to a 2U/4U case at a later date and re-purpose the R5 for another build I'm planning
- I'll worry about a redundant power supply when I transfer it to a rack case; I'm not concerned as it will be behind a UPS
- I also have an LSI card and drives on hand
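Since IPMI is non-negotiable for me, here's the kind of remote check I intend to script against the board's BMC. This is just a minimal sketch: it assumes ipmitool is installed, and the BMC address and credentials below are placeholders, not anything from this build.

```python
#!/usr/bin/env python3
"""Minimal sketch: poll a server's BMC over IPMI using ipmitool."""
import subprocess

BMC_HOST = "192.168.1.50"   # placeholder BMC address, not from this build
BMC_USER = "admin"          # placeholder credentials
BMC_PASS = "changeme"

def ipmi(*args: str) -> str:
    """Run one ipmitool command over the lanplus interface, return stdout."""
    cmd = ["ipmitool", "-I", "lanplus",
           "-H", BMC_HOST, "-U", BMC_USER, "-P", BMC_PASS, *args]
    return subprocess.run(cmd, capture_output=True, text=True,
                          check=True).stdout

if __name__ == "__main__":
    print(ipmi("power", "status"))   # e.g. "Chassis Power is on"
    print(ipmi("sdr", "list"))       # temperature/fan/voltage sensor table
```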

http://pcpartpicker.com/p/3hPtP6

My budget is 1500 USD.

Thanks for your time.

I take it you just need one powerful PC and not a render farm?

Yes, just one for now as a proof of concept, and I know this box will get re-purposed for other tasks as things evolve.
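Since Docker is on my list of things to experiment with, the first smoke test once the box is up will be something like the below. A hedged sketch: it uses the official `docker` Python SDK (pip install docker), and the image and command are just examples, nothing specific to this build.

```python
import docker  # official Docker SDK for Python

client = docker.from_env()  # connects to the local Docker daemon

# Run a throwaway container, capture its output, and clean up after it
output = client.containers.run("alpine", ["echo", "hello from the lab"],
                               remove=True)
print(output.decode().strip())  # -> hello from the lab
```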

I only have one thing to say, about the power supply.
You might as well save yourself a "few" pennies and grab a new 80+ Gold instead of a good but old 80+ Platinum.
Pretty much anything current 80+ Gold is on par with the old 80+ Platinum platforms.
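To put a rough number on that: at 50% load the 80 PLUS thresholds are about 90% efficiency for Gold and 92% for Platinum (115V, non-redundant), so the wall-draw difference is small. The load and electricity price below are my own assumptions, purely for illustration.

```python
# Back-of-the-envelope: Gold vs Platinum running cost, 24/7
load_w = 300       # assumed steady DC-side draw of the server
price_kwh = 0.12   # assumed electricity price in USD/kWh

for label, eff in [("Gold", 0.90), ("Platinum", 0.92)]:
    wall_w = load_w / eff                          # AC draw at the wall
    yearly = wall_w / 1000 * 24 * 365 * price_kwh  # USD per year
    print(f"80+ {label}: {wall_w:.0f} W at the wall, ~${yearly:.0f}/year")

# 80+ Gold:     333 W at the wall, ~$350/year
# 80+ Platinum: 326 W at the wall, ~$343/year
# A handful of dollars per year -- hence a good current Gold is a fair trade.
```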

For example, the EVGA 1000 G2. It's a heck of a nice PSU for a lot less than the 1000XP you've chosen, and almost identical. There are differences in the cables, though, and unfortunately it's not semi-passive.
http://pcpartpicker.com/part/evga-power-supply-120g21000xr
Or perhaps the EVGA 1000 P2 if you want 80+ Platinum; it of course has the semi-passive mode and stays fanless until about 50% load.
http://pcpartpicker.com/part/evga-power-supply-220p21000xr

Anyway, there's plenty of choices out there you should look at.
If you want to read reviews, you might want to look at, for example, TechPowerUp, jonnyGURU, and so on.
Realhardtechx also has a power supply review database.


Well, I might as well talk about the CPU cooler.
Perhaps you could grab the 4U version of the Noctua? The NH-U9DX i4.
It kept a 6-core 5930K @ 4.2 GHz rather cool; at 50% fan speed it was only 10°C worse than the NH-D15 despite being quieter.
http://www.tomshardware.com/reviews/dynatron-r27-r24-noctua-nh-u9dx-i4-cpu-cooler,4168.html
The little brother is perfectly capable of handling that 85 W TDP Xeon you chose, a 6-core with a 2.4 GHz base/3.2 GHz turbo, while staying quiet. Probably not 100% silent, but it shouldn't be far off.

And it could possibly save you some money in the long run.
If you end up going with a 4U rack case, voila, you'll have a 4U-compatible cooler instead of having to replace the big Noctua down the line.
If you end up going with a 2U rack case, well, you're going to have to replace the cooler whether you have the 4U version or the big Noctua.

Easy RAM access or VRM heatsink interference? This is a bit of a coin toss.
I don't know how tall those VRM heatsinks are; they might be only as high as the RAM sticks are going to be.



http://www.mod-your-case.de/index.php?forum2-showposts2-353


EVGA 1000 P2 it is! The PCI-E cables are out of control color-wise, but it saved me a chunk of change, and now I can fit a Corsair 750D case in the build.

I have visually confirmed that the EEB board will fit in there. As for the cooler, I'm going to bet it will interfere with my RAM and/or the second cooler when I install the second CPU. I know for a fact that the slim cooler I chose fits.

But I am curious, so I'm going to look into it. If it's a no-go, it's really not that big of a deal to use the coolers elsewhere and find an appropriate replacement.

Thanks very much for your helpful advice.

As you can see, it's right on the money. http://pcpartpicker.com/p/83TdK8

You might just be better off buying used off of eBay. Most of the guys on my team (and me, for a while) who have home labs get gear that's one or two generations old to play with.

An R710 is a good piece of tech and can be had in good configs for relatively cheap.


Yeah, the PCI-E cables being red is a bit... but oh well. (Did you decide to go with the P2 or the G2? The P2 isn't all that much cheaper, though it does have some MIR and promos going on; dunno if that counts as a chunk of change.)

Also, the 750D should easily eat the SSI-EEB motherboard you've chosen, since it supports E-ATX, which is the same size.
But not all of the screw holes line up.

EEB does not equal E-ATX, in case someone else wanders in here.
Here's a quick mockup with the Corsair 750D side view (it lists support for E-ATX and XL-ATX) and the Asus Z10PE-D16 WS (EEB form factor).
Blue marks the EEB mounting holes and red the Corsair 750D ones. Thankfully there's at least one mounting hole that lines up, in the top right corner.

(I have not had the pleasure of handling an EEB motherboard before, NOR have I ever seen a 750D in person, so take this with a grain of salt. But the scale should be in the ballpark.)

By the way, I don't know how you were originally planning to fit that board inside the R5?
If you didn't know, the motherboard tray is cut at an angle where those cable management holes are located.


Some boards just a tad wider than ATX probably fit fine, but a full-blown 13-inch-wide EEB most certainly does not fit the Define R5.
I have an R5 and took the 13-inch (33 cm) measurement just for giggles. It's around that red mark in the picture.

Regarding the cooler: I don't think RAM clearance is going to be the issue with the U9.
It's more about ease of access to the RAM (see those pictures in the 4th post),
and whether the push or pull fan on CPU1 or CPU2 is going to hit the VRM heatsink (whose height I do not know).

But if you are considering it, why don't you ask Noctua? [email protected] or http://noctua.at/en/contact
If you're going to, you might want to also poke Asus and inquire about the height of those VRM heatsinks.

There's of course the 3U-tall NH-D9DX i4 out there. It certainly won't have any clearance issues whatsoever, but it's going to be slightly louder: a bit less surface area and only one fan (NF-A9, max 1550/2000 rpm, vs. 2x NF-B9, max 1300/1600 rpm).
http://pcpartpicker.com/part/noctua-cpu-cooler-nhd9dxi43u
At least in an Xtremesystems forum user review, this thing kept the 8c/16t (2.4 GHz/3.2 GHz) E5-2630 v3 at 50°C under Prime95 load, BUT that's of course with the fan running at 100%.
Is the D9 going to be insanely loud with your 2.4 GHz hexacore versus the U9 or even the U12? I don't think so.

Being constructive... I think your budget is pretty high. I have been more than successful getting 15 virtual machines running off of 32 GB of RAM and an AMD 8350, on a $60 motherboard. If it is just for testing purposes and you want to run through software configs, reloading the box time and time again, I wouldn't bother with a dual-CPU configuration. Production environments are different: there you need the server hardware for reliability. Consumer-grade hardware is perfectly fine for testing.
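For a sense of how 15 VMs fit in 32 GB, here's a rough budget. All the numbers are my own illustrative assumptions, not the actual sizing of that 8350 box.

```python
# Rough RAM budget for a single-box test lab (assumed numbers throughout)
total_gb = 32
host_reserve_gb = 4                # hypervisor + host OS headroom (assumed)
vm_sizes_gb = [2] * 11 + [1] * 4   # e.g. 11 "normal" VMs and 4 small ones

allocated = sum(vm_sizes_gb)
spare = total_gb - host_reserve_gb - allocated
print(f"{len(vm_sizes_gb)} VMs use {allocated} GB, {spare} GB to spare")
# -> 15 VMs use 26 GB, 2 GB to spare: tight but workable for a lab,
#    especially with ballooning/KSM reclaiming idle pages on top.
```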

On the other hand, if you are getting this server hardware (like the LSI card) because you use it at work and want a better understanding of how to set it up and how to recover from a failed drive, then sure, go for it.
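If drive-failure recovery is the exercise, the first thing worth scripting is a health poll of the controller. A hedged sketch: it assumes an LSI card managed by the storcli utility (older cards use MegaCli instead) and assumes it is controller 0; the thread never names the exact model, so adjust accordingly.

```python
#!/usr/bin/env python3
"""Hedged sketch: poll an LSI RAID controller's drive states via storcli."""
import subprocess

def controller_report(controller: int = 0) -> str:
    """Return the controller summary: virtual drives, physical drives,
    and their states (Onln, Offln, UGood, UBad, Rbld, and so on)."""
    out = subprocess.run(["storcli", f"/c{controller}", "show"],
                         capture_output=True, text=True, check=True)
    return out.stdout

if __name__ == "__main__":
    report = controller_report()
    print(report)
    # Crude health check: a dead disk usually drops to "Offln" or "UBad"
    # (exact state strings vary by firmware, so treat this as a starting point)
    if "Offln" in report or "UBad" in report:
        print("WARNING: a physical drive looks unhealthy -- investigate")
```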

I personally like to work with crappy/cheap hardware in my test environment. It is a greater challenge and promotes optimization of the setup. For simply testing setups and experimenting, there is no need for fancy server hardware. Considering this is a learning environment, all the extra horsepower doesn't help you learn any faster; it just costs more money... and I assume your purpose for learning is to eventually make more money.

I do have to agree with you on the IPMI; it is really nice to have. I would suggest a Supermicro X10SLL-F. This is not a dual-Xeon board, but it is an Intel-based, server-grade board and far less expensive than the ASUS Z10PE-D16 WS. I use these in production environments all the time for SMBs.

Good luck with your build!

The image for the R5 (with the red mark) is no longer hosted at that address; however, it IS, for the time being, still on archive.org.

Cheers, borisi. I spent 20 mins composing an answer to this, only to realise as I posted it that it was a 7-year-old necro'd thread…

:frowning:

@SgtAwesomesauce any chance you can lock this?