Making a 10 NVMe drive RAID array?

Ahoy!

I recently spotted an intriguing deal on 10x used, ex-OEM Crucial NVMe drives. At 500GB each, that’s 5TB of SSD storage, and at the price they’re asking, it’s a tempting concept on cost alone. Combine that with the performance of a 10-strong RAID array, and it’s very tempting indeed.

However, I have no idea how this would best be implemented. I only have a single x16 slot (running at x4 max) and three x1 slots, all PCIe 3.0.

I’ve seen x16-slot PCIe expander boards with four NVMe bays, and x1-slot boards with dual NVMe bays. That totals 10, but I’m unsure what specifically to look for, or whether there’s something I’m missing in the first place. Would appreciate any advice!


I’d get a better motherboard and buy PCIe cards for NVMe.

https://www.asrock.com/mb/spec/product.asp?Model=HYPER%20QUAD%20M.2%20CARD

Just beware: on consumer boards, cards like these are effectively limited to one per system, and only if you’re not using the x16 slot for a GPU.

The PCIe slot must support x4/x4/x4/x4 bifurcation to use all four M.2 slots on the card.

If workstation-grade boards are in play, then you can run multiple cards without much issue.
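Once the card is in, it’s worth confirming each drive actually negotiated its full x4 link; if bifurcation isn’t enabled (or isn’t supported), you typically only see the first drive. A quick sketch for Linux that reads the standard sysfs PCIe attributes (assumes stock NVMe device naming):

```python
#!/usr/bin/env python3
# Sketch: list each NVMe controller's negotiated PCIe link width and speed,
# so you can spot drives that came up at less than x4 (or not at all).
from pathlib import Path

for ctrl in sorted(Path("/sys/class/nvme").glob("nvme*")):
    pci_dev = (ctrl / "device").resolve()          # underlying PCI device
    width_attr = pci_dev / "current_link_width"
    if not width_attr.exists():
        continue                                   # e.g. NVMe-oF controllers have no PCIe link
    width = width_attr.read_text().strip()
    speed = (pci_dev / "current_link_speed").read_text().strip()
    print(f"{ctrl.name}: x{width} @ {speed}  ({pci_dev.name})")
```

With a quad card behind a working x4x4x4x4 slot you’d expect four entries, each reporting x4.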

Case study on the Threadripper Pro platform from a few years back:

An alternative is an AIC with an onboard PCIe switch (no bifurcation needed), but those cost hundreds of USD each.

Have fun.

What kind of workload are you planning to run on it? This will have a large influence on what RAID setup you should use. Plus you’ll need some serious CPU to push them all at 100%.
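If you do go down this road on Linux, mdadm software RAID is the usual route. A minimal sketch, assuming the drives show up as /dev/nvme0n1 through /dev/nvme9n1 (placeholder names, verify with lsblk) and RAID 6 as just one possible level:

```python
#!/usr/bin/env python3
# Sketch: assemble ten NVMe drives into one md array with mdadm.
# RAID 6 is only an example -- pick the level that fits your workload.
# Run as root, and triple-check the device names first: this is destructive.
import subprocess

DEVICES = [f"/dev/nvme{i}n1" for i in range(10)]   # placeholder naming

subprocess.run(
    ["mdadm", "--create", "/dev/md0",
     "--level=6",                                  # two drives' worth of redundancy
     f"--raid-devices={len(DEVICES)}",
     *DEVICES],
    check=True,
)

# Afterwards, put a filesystem on /dev/md0, e.g.:
# subprocess.run(["mkfs.ext4", "/dev/md0"], check=True)
```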

Let’s say you got 10x 500GB for $100. With redundancy, you get roughly 4TB of usable space out of it. To make it work, you spend another $100 on adapters, and the solution is kind of dodgy.
For that money, you could just buy a single 4TB SSD for about $250 at current market prices.
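The ~4TB figure is what you land on if you assume two drives’ worth of redundancy (RAID 6); rough numbers:

```python
# Back-of-the-envelope usable capacity for 10 x 500 GB (illustrative only).
drives, size_gb = 10, 500

raid5  = (drives - 1) * size_gb    # one drive of parity   -> 4500 GB
raid6  = (drives - 2) * size_gb    # two drives of parity  -> 4000 GB
raid10 = (drives // 2) * size_gb   # mirrored pairs        -> 2500 GB

print(f"RAID 5: {raid5} GB, RAID 6: {raid6} GB, RAID 10: {raid10} GB")
```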

Anyway, to answer your question: you just don’t have that many PCIe lanes on a mainstream motherboard (ten drives at x4 each would want 40 lanes). Alternatively, you can use USB 3.0 to NVMe adapters.


You are better off, sanity- and money-wise, just picking up one of the various eBay enterprise 7.68TB NVMe drives (and putting a fan on it). These are sub-$500, sometimes reaching $400 or below.

It’ll give you all the IOPS you’ll never be able to use, all day long, and you’ll only need to worry about 4 lanes rather than 40, which would require a Threadripper, EPYC, or some Xeon board with multiple (bifurcation-required) adapter cards, or an expensive PCIe switch plus riser cards.

Unfortunately, like with small cheap HDDs, small cheap SSDs just aren’t really worth the price of the slots they require to connect.

I would use this as an excuse to invest in a Flashstor 12 Pro:

It’s a good way to get started in the NVMe NAS space. If you do require local storage though, your one-stop shop is the Apex Storage X21:

https://www.apexstoragedesign.com/apexstoragex21