Replacing R710 & Mini PC with new build (ZFS, ECC, etc.)

Hello! I’m looking to replace my Dell PowerEdge R710 & Steam Machine ASM100-6980BLK with a custom build. My ultimate goal is to consolidate the systems into something that uses less power and is not as loud as the R710.

TLDR Version: I’m looking for a Motherboard & CPU with ECC RAM compatibility that can handle 4 SATA drives, 4 SAS drives, 1 PCIe NVMe M.2 SSD, 1 M.2 Coral Edge TPU, and has 4 RJ45 ports or room for a card to be added. It will be used as a NAS/Media Server with 2 Windows VMs, and a bunch of docker containers.

In-depth Version

Current setup:

Dell R710 - Proxmox VE

  • 72 GB ECC RAM
  • x2 - Intel(R) Xeon(R) CPU X5660 @ 2.80GHz (2 Sockets)
  • x6 - 4TB 7.2K 6Gb/s 3.5" SAS HD (RAIDZ2)
  • x1 - 1TB Samsung SSD 860 EVO (OS)

Steam Machine - Ubuntu LTS

  • 8 GB RAM
  • x1 - Intel(R) Core™ CPU i7-4785T @ 2.20GHz
  • x1 - 1TB HDD
  • x1 - GeForce GTX 860M

Budget: $2k USD
Location: US

Needs: I would like a single PC that will host my NAS, 2 Windows VMs, and minikube/docker. I’m currently running 3 Windows Server VMs: two are app servers and one is a web server running IIS. However, Docker is now supported, so I will remove one VM and run the Tomcat web server in Docker instead. Other containers I’m running include Traefik, Plex, Jellyfin, AdGuard Home, Home Assistant, MotionEye, n8n, Mosquitto, Grafana, InfluxDB, Chronograf, Metabase, Joplin, Mealie, some of the *arr stuff, etc.
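For the web server piece, something like this is roughly what I have in mind, sketched with the Python Docker SDK (image tag, container name, port, and host path are placeholders, not my actual setup):

```python
# Rough sketch: start a Tomcat container with the Python Docker SDK
# (pip install docker). Image tag, name, port, and volume path are
# placeholders -- adjust for the real app.
import docker

client = docker.from_env()

tomcat = client.containers.run(
    "tomcat:9.0",                            # placeholder image tag
    name="tomcat-web",                       # placeholder container name
    detach=True,
    ports={"8080/tcp": 8080},                # container port -> host port
    volumes={
        "/srv/tomcat/webapps": {             # assumed host path for WARs
            "bind": "/usr/local/tomcat/webapps",
            "mode": "rw",
        }
    },
    restart_policy={"Name": "always"},       # come back up after reboots
)
print(tomcat.name, tomcat.status)
```

In practice I’d deploy it with ansible like the rest of my containers; this is just the shape of it.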

I don’t need any peripherals and do not plan on overclocking anything.

I have the following available:

  • x2 8TB Seagate IronWolf NAS 7.2K 6.0Gb/s CMR 3.5" SATA HDs
  • x1 1TB Samsung 980 PRO PCIe 4.0 NVMe M.2 SSD
  • x3 400GB Pliant LB406S 2.5" SAS SSDs
  • x2 32GB 3200MHz DDR4 ECC CL22 DIMM 2Rx8 (Unbuffered)
  • x1 LSI SAS2008-IT Controller

Thoughts:

Media Storage: I currently have 14TB of usable storage, and I don’t need that much. I am considering just mirroring the two 8TB SATA drives I have available and using that for my media/file storage.

OS: Mirror two of the 400GB SAS drives for the OS. I was never able to set up a mirrored boot drive on the R710… Maybe this time.

Cache: Maybe use the 1TB NVMe SSD for this? I haven’t set up an L2ARC before, so I need to read up on it.

VMs: I am considering pulling two of the 4TB SAS drives I have in the R710 for my VMs and mirroring those as well.
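If that layout holds up, the pool creation would look something like this rough sketch (pool names and /dev/disk/by-id paths are placeholders, not my actual device IDs; the OS mirror would be handled by the installer rather than by hand):

```python
# Rough sketch of the pool layout above, driven from Python via subprocess.
# Pool names and /dev/disk/by-id paths are placeholders; double-check the
# real device IDs before running anything destructive.
import subprocess

def zpool(*args):
    """Run a zpool command and raise if it fails."""
    subprocess.run(["zpool", *args], check=True)

# Media/file storage: mirror the two 8TB IronWolf SATA drives.
zpool("create", "tank", "mirror",
      "/dev/disk/by-id/ata-IronWolf8TB-A",      # placeholder ID
      "/dev/disk/by-id/ata-IronWolf8TB-B")      # placeholder ID

# VM storage: mirror two of the 4TB SAS drives pulled from the R710.
zpool("create", "vmpool", "mirror",
      "/dev/disk/by-id/scsi-4TB-SAS-A",         # placeholder ID
      "/dev/disk/by-id/scsi-4TB-SAS-B")         # placeholder ID

# Optional read cache: the 1TB 980 PRO as L2ARC on the media pool.
zpool("add", "tank", "cache",
      "/dev/disk/by-id/nvme-Samsung_980_PRO_1TB-X")  # placeholder ID
```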

Case: I am looking at the Fractal Design Meshify 2 XL, but I would consider a short-depth rackmount case as well. I guess it depends on the motherboard and what works best for cooling. I have a 22U server rack and an APC Smart-UPS 2200, but I would like to downsize the rack when I replace the R710. I also need to get rid of an R420 and an HP DL380 G7 that I no longer use. Once those are replaced/removed, the deepest device I have is 18".

Motherboard & CPU: I’m looking at the AMD Ryzen 9 5950X, but I’m not sure what motherboard would work best. If the HD/SSD setup from above looks good, then I need a motherboard that can support those SATA & SAS drives + two more SATA drives in a couple of years. I will likely need a new CPU & Motherboard before I can use all of the HD slots in the Fractal case.

I would like to also add in one of the Coral TPU M.2 accelerators when they become available again. It would be passed through to a Frigate docker container.

Ideally both CPU and Motherboard would support ECC memory.

It would be nice to have x4 1Gbps Ethernet ports. None of my current switches support anything beyond 1Gbps at this time. I don’t see myself upgrading those anytime soon.

So, what are your thoughts? I looked at a handful of EPYC CPUs, but I think the CPU and Motherboard would be outside my budget. I also started looking at the Alder Lake CPUs (specifically the Intel Core i9-12900), but I can’t find any Motherboards with the Intel W680 chipset for ECC compatibility. I didn’t bother looking for DDR5 ECC RAM.

Thanks in advance for any insight!

I should probably mention the OS. I haven’t decided between Proxmox, TrueNAS Scale, Debian 11, or Ubuntu 22.04 LTS.

I like TrueNAS Scale’s interface for managing ZFS, but I don’t care for using the interface for apps. Most of my docker containers are deployed using ansible. Proxmox has worked great, but I’ve never been able to set it up with a mirrored boot drive. Either way, it’ll be something based on Debian.
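On the mirrored boot drive: from what I’ve read, the Proxmox VE installer can lay the boot pool out as a ZFS mirror (the ZFS RAID1 option) if both SAS SSDs are selected as install targets, so I may not have to set it up by hand this time. A quick check like the sketch below should confirm it afterwards (rpool is just the installer’s default pool name):

```python
# Quick post-install sanity check that the boot pool is actually a mirror.
# "rpool" is the Proxmox installer's default ZFS pool name.
import subprocess

status = subprocess.run(
    ["zpool", "status", "rpool"],
    capture_output=True, text=True, check=True,
).stdout

print(status)
if "mirror-0" not in status:
    print("WARNING: rpool does not appear to be mirrored")
```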

For similar requirements I use (and can recommend)

  • Motherboard ASUS Pro-WS-X570-ACE
  • AMD Ryzen Zen3 based CPU (I use 5900x)
    Combination supports ECC. The MB has three x16 PCIe Gen4 slots that can work at x8/x8/x8. That’s good for expansion and, as far as I know, more usable x8 slots than any other X570-based MB. The MB has an M.2 port for the Samsung 980 PRO. You can run the Samsung SSD 860 EVO off a SATA port or the LSI SAS2008-IT controller. The MB supports bifurcation on all PCIe ports, meaning you can think about adding NVMe drives in the future. The MB has two 1G network ports. I turn off the one with the fake BMC (it doesn’t work in Linux) because it delays boot by a minute, and instead use a fast PCIe network expansion card. You can get a 4x 1GbE card if you fancy (I have no experience with those).
  • Get a SAS expander card, e.g. the Intel RAID Storage Expander RES2SV240
    This will allow you to connect all of your current and possibly future HDs, both SAS and SATA. The benefit of this Intel expander is that it can live in a PCIe slot (and will take power from it), but it does not require one and can instead be powered off a 4-pin connector. A quick way to confirm everything shows up once it’s cabled is sketched below.
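Once the drives are cabled through the HBA and expander, a read-only check like this is enough to confirm they all show up (it just enumerates /dev/disk/by-id, skipping partitions and duplicate WWN aliases):

```python
# Sanity check after cabling: list every whole-disk device the kernel sees
# under /dev/disk/by-id. Read-only; nothing here touches the disks.
from pathlib import Path

by_id = Path("/dev/disk/by-id")
disks = sorted(
    p.name for p in by_id.iterdir()
    if "-part" not in p.name             # skip partition entries
    and not p.name.startswith("wwn-")    # skip duplicate WWN aliases
)
for name in disks:
    print(name)
print(f"{len(disks)} devices found")
```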

You already have the RAM (and a case picked out), so you could be set. The total would come to roughly:
~$400 for the motherboard
<$600 for the CPU (even if you splurge :slight_smile: )
~$200 for the SAS expander + SAS breakout cables

That’s way below your budget. You can think about investing in some quiet Noctua fans and/or an AIO and your build can be dead-silent.

I have a similar setup, with 15x 8TB SAS/SATA HDDs and 4x NVMe drives, and it consumes about 180W at idle.

Pick your favorite OS (the one you’re most comfortable with).
Configure all of the HDDs into ZFS pools and boot off the 860 EVO.

Since you still haven’t hit your budget, consider picking up one of the last Intel Optane drives ($550, 960GB). Partition it and configure the partitions as SLOG, special device, and, if it suits your use case, L2ARC. This config nicely saturates 10G Ethernet and will make your ZFS shares move.
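The rough shape of that would be something like the sketch below (partition sizes, the device path, and the pool name are placeholders; note that zpool will warn, and may need -f, because a lone special vdev has no redundancy and becomes a single point of failure for the pool):

```python
# Illustrative only: carve an Optane drive into SLOG / special / L2ARC
# partitions and attach them to an existing pool named "tank".
# Sizes and the device path are placeholders.
import subprocess

def run(cmd):
    subprocess.run(cmd, check=True)

optane = "/dev/disk/by-id/nvme-Optane-960GB-X"   # placeholder ID

# Partition: small SLOG, mid-size special vdev, rest of the drive as L2ARC.
run(["sgdisk", "-n", "1:0:+16G",  "-t", "1:bf01", optane])   # SLOG
run(["sgdisk", "-n", "2:0:+200G", "-t", "2:bf01", optane])   # special
run(["sgdisk", "-n", "3:0:0",     "-t", "3:bf01", optane])   # L2ARC

# Attach the partitions to the pool. The special vdev add may require -f
# because it is not mirrored like the data vdevs.
run(["zpool", "add", "tank", "log",     optane + "-part1"])
run(["zpool", "add", "tank", "special", optane + "-part2"])
run(["zpool", "add", "tank", "cache",   optane + "-part3"])
```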


If you’re thinking Threadripper Pro or EPYC, don’t bother; your budget would need to increase by $3,000, and that would be for used parts. I am looking to replace my current home network lab with either Threadripper Pro or EPYC. Do I need a 128-core network lab? No, but it is something I want. Some people collect cars; others collect paintings; I collect computers with huge core counts. The build @jode suggested is all you need.

Thanks @jode & @Shadowbane!

I went down the TR/EPYC rabbit hole for a minute because they were the only CPUs that looked promising for ECC support. Then I came across this forum and realized that the Ryzen CPUs should work depending on the MB. I actually pulled the trigger today on a Ryzen 9 5950X and the ASUS Pro-WS-X570-ACE MB @jode recommended.

I still have a couple of things to select (GPU, chassis, cables, etc.), but I’m glad to finally have those nailed down. Thanks again!


I have a build very similar to the one @colonelpanic will have; the only difference is that my motherboard is manufactured by ASRock. If you want or need to get the full benefit of ECC RAM, then yes, you need Threadripper Pro or EPYC, but then your budget will need to increase quite a bit. If you went with Threadripper Pro or EPYC, you would essentially be replacing your Dell R710 with an AMD version. If you hadn’t already purchased your motherboard, I would have suggested a server motherboard for that Ryzen 5950X.

@Shadowbane What board did you have in mind?

Always interested in ways to improve.

On this topic I found an interesting accessory for the MB: a 10GbE NIC that fits into an M.2 slot, the Innodisk EGPL-T101. It has an Aquantia chip inside and should be well supported under Linux.

The second M.2 slot on the ASUS Pro-WS-X570-ACE is only wired for two PCIe Gen4 lanes. I personally find it a waste to use it for NVMe SSDs, as their performance will be limited by the available lanes.
On the other hand, I’ve always found 1GbE NICs insufficient for my use case. This product promises to elegantly solve both limitations.

It may not necessarily fit your use case, but I wanted to share this discovery nonetheless.

I’m confused: don’t you need a SAS controller card to use a SAS expander? Isn’t it just SATA on that motherboard?

That’s correct. The OP lists an LSI SAS2008 controller card in his possession.
