My Crazy Pills aren't strong enough to help me decide what kind of server setup is better

I feel like I’m going crazy trying to figure this out. I currently have a ProLiant ML350 Gen 5 server and need to upgrade so I don’t have to keep buying new HBAs or PSUs that break every year.

I was able to purchase a dozen U.2 drives and a NetApp disk shelf; I hadn’t realized at the time that NVMe drives aren’t compatible with a SATA shelf. I also purchased a 1000W power supply, an IceGiant cooler, and an iStarUSA D-400L-7 case for a PC gaming build that I had to scrap due to the pandemic.

Now I’m at a crossroads.

Do I go for a DIY Threadripper Pro in a 4U build, or a barebones 2U enterprise server build with a separate gaming PC?
Note: I would buy a 2U server for an all-in-one, but that isn’t likely to have enough (or any) full-sized PCIe slots left over.

Threadripper Analysis:

Pros:
	* I can have my gaming and general server be the same machine (so I save space)
	* It’s a 4U, so I can add any full-height PCIe card / GPU without issue
	* I can build the system over time. (Not needing to buy all the components at once makes it easier to afford)
	* I get to use the parts I already purchased.

Cons:
	* Dealing with the headache of managing NVMe/PCIe in general
		- No relatively affordable way to mount four 15mm U.2 drives in a 5.25” bay.
		- Cables and PCIe cards for NVMe can be shoddy and often only work with qualified systems (Obviously DIY builds are not qualified)
		- I need to cable-manage all of this in a generic case

2U NVMe Server Analysis:

Pros:
	* I don’t have to pull my hair out over some component breaking, or the board only having PCI-X or PCIe Gen 1 slots.
	* NVMe and PCIe would be validated, so I shouldn’t have too many issues with them.
	* Built-in hot-swap

Cons:
	* It’s an expensive purchase all at once 
	* Can’t fit a full-sized graphics card in there.
	* I would need to spend even more on a separate gaming machine.

Overall:

Both are expensive and will give me headaches in different ways. Do I go for the all-in-one jankfest that will possibly save me money, or the nice but crazy expensive 2U server with a separate gaming PC? It’s a tough choice to make.

Like I don’t need crazy performance, but I do need a lot of PCIe bandwidth.
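
Back-of-the-napkin math on that bandwidth, assuming my drives are the common Gen3 x4 type (I’d have to plug in the real specs):

```python
# Rough PCIe lane / bandwidth budget for the dozen U.2 drives.
# Assumptions: Gen3 x4 per drive; PCIe 3.0 moves ~985 MB/s per lane
# after 128b/130b encoding. Real numbers depend on the actual drives.

DRIVES = 12
LANES_PER_DRIVE = 4          # typical U.2 NVMe link width
GEN3_MB_S_PER_LANE = 985     # approx. usable throughput per Gen3 lane

lanes = DRIVES * LANES_PER_DRIVE
aggregate_gb_s = lanes * GEN3_MB_S_PER_LANE / 1000

print(f"Lanes for storage alone: {lanes}")                  # 48
print(f"Theoretical aggregate: {aggregate_gb_s:.1f} GB/s")  # ~47.3 GB/s

# Threadripper Pro exposes 128 CPU lanes, so 48 for storage still
# leaves room for a full x16 GPU plus NICs/HBAs.
```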

I really hope I can find a solution here or I’m going to have to give up on it again for another year.

The first option would probably be the faster and cheaper one. But now come the hard questions that nobody has asked (yet):

  • What are you planning to run on it?
  • What are your actual requirements?
  • Why are you building this for home use?
  • Do you already have a rack, or was the ProLiant just sitting somewhere on a shelf?
  • How often are you going to swap drives, such that you require external access to them?
  • How likely is it that you would need to swap drives, and does your system really need to be powered on while doing so (i.e. do you really need hot-swap)?
  • Would 2 smaller builds that can be powered on independently fit the bill better?

If you haven’t guessed already, I’m here to ~~force your crazy pill down your throat~~ help you administer your medication.

In my own infrastructure, besides starting to split everything across multiple ultra-small devices (ARM SBCs), I try to keep things that don’t need to run 24/7 actually turned off. For example, my virtualization box and gaming VFIO VM run on a TR 1950X. When I don’t have to run the TR, I keep it powered off. That hasn’t happened for a while, because I’m in the middle of a long data transfer and only the TR was available for it. When that is done, the system will be powered on on demand when I want to game or test some stuff.
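
If “powered on on demand” sounds tedious, Wake-on-LAN takes the pain out of it. A minimal sketch in Python (the MAC below is a placeholder, and WoL has to be enabled in the NIC / BIOS first):

```python
import socket

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Send a Wake-on-LAN magic packet: 6 bytes of 0xFF followed by
    the target MAC repeated 16 times, broadcast over UDP."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))

wake("aa:bb:cc:dd:ee:ff")  # placeholder -- use the TR box's NIC MAC
```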

All the rest of my infrastructure will run on SBCs. I am not saying you should go full ARM, but consider your actual needs. What needs to be on 24/7? Do you have a Plex / Jellyfin server? Do you have a NextCloud server? Are you hosting a website? Email?

I highly doubt you need the gaming PC 24/7, unless you are playing some MMO and botting to farm stuff. Having it separate can save you some headaches (and money on the power bill). Even if you plan to get a baller GPU, you can still do an SFF build and get away without using as much space as a 4U case, even with 2 separate builds.
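
To put a rough number on the power-bill point (the idle wattages and rate below are assumptions, measure your own):

```python
# Back-of-envelope yearly cost of idling a box 24/7.
# All figures are assumptions -- substitute your own measurements/rate.

HOURS_PER_YEAR = 24 * 365
USD_PER_KWH = 0.15            # assumed electricity price

def yearly_cost(idle_watts: float) -> float:
    return idle_watts / 1000 * HOURS_PER_YEAR * USD_PER_KWH

print(f"TR box idling 24/7: ${yearly_cost(120):.0f}/yr")  # ~$158
print(f"Small box 24/7:     ${yearly_cost(25):.0f}/yr")   # ~$33
```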

Unless you have a specific need for lots of cores at high frequency, I suggest you do a 7600X / 7950X build (depending on how many cores you need) and a separate lower-powered box, like an Athlon 3000G / Ryzen 3400G or an i3-12100. These options have built-in GPUs to handle encoding / decoding and should support 64 GB of RAM (even unbuffered ECC on AMD).


As for the U.2 drives, you probably won’t have an easy way to mount these externally in 5.25" bays, but if you buy any case with support for 3.5" drives, you can adapt from the 2.5" “fat” 15mm form factor to 3.5". And if you want them easily detachable from your system, for whatever reason, buy a case that has internal trays, like the Antec P101 Silent (not SFF, but it’s a massive case; I have it and love it). The case is easy to open and the drives are easy to access.

I believe Wendell showed some PCIe to U.2 cards on which you can fit the drives themselves, like 2 drives per x8 card, but I could be wrong.

This is the cheap way to go. It will tie up 6 slots on the Threadripper mobo for a dozen NVMe SSDs, though (at 2 drives per x8 card, 12 drives means 6 cards and 48 lanes).

What do you need all that storage performance for again?

Depending on the requirement, maybe the investment in a solid hot-swappable chassis is warranted.
