New to NVMe

Hello all,

I have recently been playing with the idea of setting up a raid 5 equivilant system made of m.2 NVME sticks in my desktop system using one of those cheap pcie cards. There are some things that I am confused about and I would like your guys’ input. Maybe perhaps the product listings dont acually make sense like the ebay battery market.

My system is built around the Gigabyte 990FXA-UD7 motherboard. I have two R9 290X cards installed and 32 GB of RAM, running Debian Sid non-free.

I want to set up the array with perhaps four of these cheap NVMe sticks, 256 or 512 GB each. My system only supports PCIe 2.0, so would it even be recommended that I go this route, since the gap between SATA 3 and these drives would be much smaller? This would primarily hold my home directory, so mostly fairly large files, meaning IOPS is less of an issue.
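
My rough math on the gap, assuming roughly 500 MB/s of usable bandwidth per PCIe 2.0 lane and ~550 MB/s for a good SATA 3 SSD (both ballpark figures, not measured):

```python
# Ballpark link-speed comparison; both constants are assumptions.
PCIE2_LANE_MBPS = 500   # usable MB/s per PCIe 2.0 lane after 8b/10b encoding
SATA3_SSD_MBPS = 550    # practical MB/s for a decent SATA 3 SSD

nvme_ceiling = 4 * PCIE2_LANE_MBPS  # one NVMe drive in an x4 link
print(f"NVMe on PCIe 2.0 x4: ~{nvme_ceiling} MB/s ceiling")
print(f"SATA 3 SSD:          ~{SATA3_SSD_MBPS} MB/s")
print(f"Gap:                 ~{nvme_ceiling / SATA3_SSD_MBPS:.1f}x")
```

So still a real gap, just nowhere near what the drives could do on PCIe 3.0.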

These adapter cards are very confusing. My understanding is that an M.2 NVMe drive takes four PCIe lanes. Some of these cards have four lanes on the connector but two slots where drives can be added, and there are full-length x16 adapter cards that only support one drive. I also cannot find an x16 adapter card that supports four NVMe sticks. Is there an adapter card that this forum recommends?


It sounds like you might be better off using some SATA SSDs to keep cost down. All of the less expensive quad M.2 cards require your motherboard to support PCIe bifurcation; otherwise only the first connected drive will be visible to your system. You can get a card with a PCIe switch chip on it, which will let you use four drives in one slot, but you will be spending $250+ on just that card.
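
If you want to test what your board actually sees, here is a quick sketch that lists the NVMe controllers the Linux kernel has enumerated (it just reads sysfs, so Linux only); on a non-bifurcating board with a passive quad card you would typically see only one entry:

```python
# List the NVMe controllers the kernel has enumerated via sysfs.
# On a board without bifurcation, a passive quad M.2 card will
# typically show only the first drive here.
from pathlib import Path

for ctrl in sorted(Path("/sys/class/nvme").glob("nvme[0-9]*")):
    model = (ctrl / "model").read_text().strip()
    print(ctrl.name, model)
```
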
I use this Adwits card with a switch chip in my X99 system, and it works great without the motherboard supporting bifurcation.
https://www.amazon.com/gp/product/B08348376V
The downside is that it costs as much as one of the drives I'm running in it.
Electrically it's only x8, so on your board with PCIe 2.0 the maximum theoretical speed will be about 3300 MB/s.
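
For what it's worth, that 3300 figure is just lane math: eight lanes at roughly 500 MB/s usable each, minus some allowance for packet overhead (the exact overhead fraction below is a guess for illustration):

```python
# Where the ~3300 MB/s ceiling comes from (rough model, assumed numbers).
PCIE2_LANE_MBPS = 500     # usable MB/s per PCIe 2.0 lane after 8b/10b
LANES = 8                 # the card is electrically x8
PACKET_OVERHEAD = 0.17    # rough allowance for TLP/protocol overhead

raw = LANES * PCIE2_LANE_MBPS
print(f"x8 PCIe 2.0 raw:       ~{raw} MB/s")
print(f"after packet overhead: ~{raw * (1 - PACKET_OVERHEAD):.0f} MB/s")
```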


Just be aware that not all motherboards of that era (2011-ish) can even boot via NVMe.

I purchased a SilverStone ECM21 M.2 to PCIe x4 adapter in 2017 and stuck it in my Asus Maximus IV Extreme-Z just as a bit of an experiment. The installed M.2 NVMe SSD was recognised, and I was eventually able to get Ubuntu installed, but nothing I tried could get it to boot.

A variety of performance tests showed that the card was working fine (and has been working fine ever since in a different system), so it wasn’t the card’s fault. The motherboard simply did not support booting over NVMe.

I later discovered that there is custom (third-party) firmware out there that will allow my motherboard to boot from an NVMe drive, but I haven't pursued that.

Had I purchased an M.2 AHCI SSD, I believe I would have been able to boot without issue. The AHCI driver is the same one that gets used for regular SATA drives.

tl;dr: Not all motherboards of that age have firmware drivers that support booting from NVMe. Using M.2 AHCI SSDs would probably overcome that problem. If you're not booting from the drive, and are just using it for storage, then you should be fine. The SilverStone product I purchased has been performant and trouble-free for around four years.


Thank you for your reply. Yeah, my motherboard does not support bifurcation. I do have two PCIe x8 slots and a PCIe x4 slot, so I could run three drives separately on the less expensive adapters. Since I have five unused SATA 2 ports, I think I will buy more Samsung 850 EVOs and make an array from those.

Here is the type of adapter card you are looking for. Note that your motherboard MUST support 4x4x4x4 PCIe bifurcation in order for it to see more than the first device. This also assumes that your board supports NVMe booting: https://www.amazon.com/gp/product/B084HMHGSP

The cards that have two slots are usually meant to hold only one drive at a time; they typically let you use either an NVMe M.2 drive or a SATA M.2 drive.

That said, please reconsider your planned setup. Rather than pay the extra price of an adapter card and take on the failure risk of multiple NVMe drives, just buy one 1-2 TB drive, plus a larger SATA SSD or HDD for continuous incremental backup. Keep in mind that for typical consumer use, the write endurance of a SATA or NVMe SSD is going to outlast the system it's in, even bought new. I still have an old 60 GB mSATA drive that works perfectly. It's much easier to reuse one larger drive than a bunch of small ones.
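
To put rough numbers on the endurance point (the TBW rating and daily write volume here are illustrative assumptions; check the datasheet of any drive you actually buy):

```python
# Illustrative endurance math with assumed numbers.
RATED_TBW = 600          # typical rating for a 1 TB TLC drive (assumption)
DAILY_WRITES_GB = 40     # generous consumer workload (assumption)

years = (RATED_TBW * 1000) / (DAILY_WRITES_GB * 365)
print(f"~{years:.0f} years to hit the rated endurance")
```

Even if you halve the rating and double the writes, the drive still outlives the build.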

Also, while it's fairly cheap and trivial to add more SATA ports via an HBA card, PCIe lanes are very limited in everything except prosumer Threadripper and server gear, and NVMe drives consume four lanes each. Rather than burn an x16 slot on your future motherboard, keep it to one or two drives and just put them in the onboard slots.

And if you really want device redundancy, go with mirrors to keep your IOPS, not with RAID 5.
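
The usual back-of-the-envelope for why (a simplified steady-state model; real behaviour depends on stripe size, caching, and workload):

```python
# Simplified RAID trade-off for 4 equal drives (assumed 512 GB each).
# RAID 5: one drive's worth of parity; a small random write costs ~4 I/Os
#   (read old data, read old parity, write new data, write new parity).
# Mirrors (RAID 10): half the raw capacity; a write costs 2 I/Os.
n, drive_gb = 4, 512

print(f"RAID 5 usable:  {(n - 1) * drive_gb} GB, ~4 I/Os per small write")
print(f"RAID 10 usable: {n * drive_gb // 2} GB, ~2 I/Os per small write")
```

You give up some capacity with mirrors, but random writes don't pay the parity read-modify-write tax.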

Don't bother with the EVOs. Unless you use them as the C drive, the Magician software can't do its tricks, and you lose a lot of performance and manageability. Just stick with standard SSD drives.
