Possible future NAS build

Hello from a long time lurker,

I’m in the process of rebuilding my homelab machines and am looking for some advice. A bit of backstory:

Currently running an E3-1230V2 Xeon with 8x 12TB drives and TrueNAS Core. I also have a Dell T5810 that I was using with Proxmox. The SSD died in the T5810, and I had already picked up an Epyc CPU/MOBO/RAM combo, so it’s time to finally get that box up and running. Initially my plan was to virtualize TrueNAS, and I’ll probably still test that at least, but I guess you could say I’m part of that old-school contingent that believes in separating the hypervisor and the NAS, so I’ve been looking at upgrade options.

Long story short, after some digging the Ryzen 5 PRO 5650GE (updated CPU) seems to fit the bill: the iGPU lets me play with Jellyfin, and I keep ECC support without breaking the bank. Where I’m running into questions is motherboards. As far as server-grade boards go, the X470D4U looks like my best option, but are there any others? I know Asus supposedly works with ECC and can be about $100 cheaper, but honestly I haven’t had the best luck with Asus motherboards over the years, and I’d rather have IPMI.

The short list is currently (PCPartPicker isn’t much use here for the motherboard and RAM):

AMD Ryzen 5 PRO 5650GE ~$170
AsRock Rack X470D4U $269.99

I already have a 10Gb SFP+ NIC, an LSI HBA, and a Fractal Define R5 to stuff it all in, and the drives would be moved over from the old system. Assuming this setup will be fine for what I’m looking to do, any RAM recommendations would be welcome. I’ve looked at the QVL, but they only list one 32GB module and I can’t actually find it for sale anywhere. 2x32GB or 4x16GB would be ideal, but depending on price I’d be fine with 32GB total.

Anything I missed? TrueNAS might run a few things (currently running Pi-hole while the VM server is down), but once the Epyc server is up and running the only thing I think I’d run on it is Jellyfin; everything else would be on the Proxmox server.

You need Pro CPUs for ECC support and integrated graphics

Good thing I asked; the way it’s listed on a website I checked made it seem like the whole 5000 series supported ECC. So now I have a few 35W options, I guess.

Why not go for AM5 instead?
I’m quite sure people here have verified that B650 boards also work with ECC and/or linked to such reports.

It’s been a while since I looked, but DDR5 ECC was pretty much unobtainium the last time I checked, I guess a few months ago. I’ll go take another look.

Here is a quick AM5 build example that supports ECC I just made in another thread, that might be of use:

PCPartPicker Part List

Type | Item | Price
CPU | AMD Ryzen 5 7600 | $218.42
Motherboard | Asus TUF GAMING A620M-PLUS WIFI | $149.99
Memory | Patriot Viper Black 2x8 GB DDR5-5200 CL36 | $54.99
Storage | Samsung 980 250 GB M.2-2280 PCIe 3.0 | $33.00
Case | Fractal Design Meshify 2 | $159.99
Power Supply | Thermaltake Toughpower GX2 600W | $59.99
Wired Network Adapter | Intel X520-DA1 10 Gb/s | $127.50
Custom | ORICO M.2 to 6x SATA 6Gbps RAID Adapter Card | $49.99
Total | | $853.87

Ignore the parts you don’t need / want. The motherboard should be ECC capable, at least the specs say so…? If that is the case, the only thing you need to do is swap the RAM for ECC RAM (from a quick search I could only find ECC sticks at about twice the price, though).

Not sure why you’re recommending a motherboard with very limited expandability. 8GB sticks are a bad choice even if you can only afford one 16GB stick for the time being. The SSD is kinda crap, and why such a tiny one? A 500GB P5 Plus is ~50 USD and will run circles around it in all ways if you’re on a budget; if you’re on a very tight one, even the P3 Plus performs a lot better and is cheaper. Most M.2 to SATA adapters seem to be of questionable quality (mainly being on very thin PCBs, so they flex a lot). Silverstone appears to have tried to mitigate this with the ECS07, but I have no experience with the JMB585 controller, so I can’t say how well it works.

Something like the Asus TUF GAMING X670E-PLUS is, despite the Realtek NIC, quite a solid option with good expandability. The Asus ROG STRIX X670E-A comes with an Intel NIC instead, but with a x1 PCIe slot, which is kinda useless.

@badtz_maru
You can easily get stock from Kingston and Crucial/Micron these days.

The research I’ve done so far makes A620 not seem very compelling. So far the boards that look interesting (GIGABYTE MC13-LE0, AsRock Rack B650D4U, Supermicro H13SAE-MF) are more than I wanted to spend but I’m still looking through the ASUS B650 and X670 boards.

Given my recent experience with Gigabyte, and the “recent” trainwreck that is the Gigabyte MW34SP0 motherboard, I’d highly consider avoiding Gigabyte.

The ASRock boards look pretty nice (probably much more expensive than they’re worth, though, unless you highly value IPMI), but the absence of BIOS updates would make me very cautious.

I don’t see the benefit of the Supermicro mobo at all unless you’re dead set on the brand.

The ASUS ProArt X670E-CREATOR is a nice board but might be a bit overkill depending on your requirements. The 8x/8x/2x arrangement of PCIe slots is quite rare, especially combined with ECC support. Some MSI boards have 8x/8x/4x but lack ECC support.

The X670 ProArt board is overkill, but the B650 ProArt looks interesting. It keeps the 8x/8x/4x layout (Gen 4 instead of Gen 5) without all the features I don’t need. It’s also reasonably priced at $239.

I can live without IPMI considering the machine wouldn’t be living in a datacenter. Crucial also lists compatible ECC UDIMMs, and not only are they available, but the price has dropped significantly since the last time I looked into DDR5 ECC pricing.

I’d lose some of the features and PCIe connectivity I’d get with a Xeon/Epyc, but I’d gain a lower TDP and an iGPU, and realistically I don’t need to go ham tuning ZFS on my home NAS. I’ll do some more digging, but this might be a winner.
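Once the DIMMs are in, I’ll want to confirm the platform actually reports ECC as active rather than just tolerating the sticks. A minimal sketch of that check, assuming `dmidecode` is installed (it works on both Linux and FreeBSD; run it as root) — the helper name here is mine, not from any library:

```python
# Sketch: confirm the firmware reports an ECC correction type for the
# Physical Memory Array, by parsing `dmidecode --type memory` output.

def ecc_active(dmidecode_output: str) -> bool:
    """True if the Physical Memory Array reports an ECC correction type."""
    for line in dmidecode_output.splitlines():
        line = line.strip()
        if line.startswith("Error Correction Type:"):
            # Typical values: "Multi-bit ECC", "Single-bit ECC",
            # "None", "Unknown" -- only the first two mean ECC is on.
            return "ECC" in line.split(":", 1)[1]
    return False

# Example against a captured snippet of dmidecode output:
sample = """\
Physical Memory Array
        Location: System Board Or Motherboard
        Error Correction Type: Multi-bit ECC
"""
print(ecc_active(sample))  # True for this sample
```

Feed it the real output of `dmidecode --type memory`; if it comes back False with ECC UDIMMs installed, the board or BIOS isn’t enabling ECC.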

Yes, it may be a decent tradeoff, but you’ll probably want to add a NIC given the onboard Realtek NICs.

I already have a 10Gb SFP+ NIC, not a problem.

A little heads-up: not sure how well AM5 fares on TrueNAS Core, as it’s AFAIK still on FreeBSD 13.1 ( https://download.freenas.org/ ), possibly with some backports. From what I understand, Scale is less reliable and also targets a rather old version of Debian/Linux kernel. The upcoming FreeBSD 14.0 (FreeBSD 14.0 Release Process | The FreeBSD Project) runs fine, however; my box is currently on 14.0-ALPHA2, so a bit behind the latest snapshot ( New FreeBSD snapshots available: stable/14 (ALPHA4) (20230901 4c3f144478d4) ), but it runs fine doing NAS stuff and a bunch of other things.

I fully intend to test before replacing my old NAS. I know a lot of people are moving to Scale, but I have a soft spot for FreeBSD + ZFS (and had all kinds of problems the last time I tried Scale), and if it came to it, I could just manually set up all the TrueNAS stuff on 14.


A620 isn’t that bad of a deal for a SOHO NAS. The main differences are PCIe 4.0 instead of PCIe 5.0 (completely irrelevant for a 10 GbE NIC and SATA drives, neither of which even saturates a PCIe 3.0 x4 slot), no overclocking support (you don’t want to overclock a NAS, and a 7600 is plenty powerful), no CrossFire support (CrossFire is dead anyway), and fewer PCIe lanes (not a problem on mATX). And that Asus board even supports ECC. So, for the use case I would say it fits, but naturally you can always pay more if you want to :slight_smile:
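The back-of-envelope arithmetic behind that bandwidth claim, using the encoding overheads from the PCIe and SATA specs (128b/130b for PCIe 3.0, 8b/10b for SATA):

```python
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding,
# so usable bandwidth is roughly 0.985 GB/s per lane.
pcie3_lane = 8 * 128 / 130 / 8   # GB/s per PCIe 3.0 lane (~0.985)
pcie3_x4 = 4 * pcie3_lane        # ~3.94 GB/s for an x4 slot

nic_10gbe = 10 / 8               # 10 GbE line rate = 1.25 GB/s
sata_6g = 6 * 8 / 10 / 8         # SATA 6 Gb/s with 8b/10b -> 0.6 GB/s

print(f"PCIe 3.0 x4: {pcie3_x4:.2f} GB/s")
print(f"10 GbE NIC : {nic_10gbe:.2f} GB/s")
print(f"SATA 6Gb/s : {sata_6g:.2f} GB/s")
# A 10 GbE NIC plus a few saturated SATA links still fit in an x4 slot.
```

So even a fully loaded 10 GbE NIC and a couple of SATA drives at full tilt leave headroom in a PCIe 3.0 x4 slot, let alone a 4.0 one.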

The best NAS option IMO would be an ITX board with one PCIe slot, two m.2 slots, 12-16 SATA ports and a dual 10 GbE NIC, but that is just waaaaay too specialized for most companies so :person_shrugging:

If I only needed to add an HBA or a NIC it would work, but I need to add both, and as far as I can tell none of the ASUS A620 boards break the PCIe lanes out as one x16 plus one x4. Might be fine for other use cases, but it’s a non-starter in mine.

I did include this one in the build, didn’t I?

Those adapters are, in my experience, notoriously unreliable. The least-bad failure mode is drives dropping offline when the adapter overheats while moving a lot of data, the same issue I’ve seen with PCIe-to-SATA adapters. The best one I’ve personally seen is the Silverstone ECS07, which has a heatsink, but I can’t find a good reason to spend $50 on that instead of on a better motherboard when I already have an HBA. If for some reason I specifically wanted to build something small and it wasn’t going to be that busy, it might be worth it, but neither of those is true in my case.


If you have an HBA that’s not ancient, there’s little reason to switch unless it causes issues. The major benefits of going for, let’s say, an ASM1166 PCIe card would be AHCI (no need to worry about drivers and firmware) and cooling, as such cards tend to run a lot cooler being more “dumb”/less capable.
Like this one: ECS06

ASRock Rack do provide BIOS updates, though?

Maybe you’re pointing at their update cadence, given there hasn’t been one in recent months. Which, sure, they don’t really publish updates frequently (which I would actually expect to be desired) and have been slow in releasing updates for new AGESA versions and so on, but they eventually do…