So I bought a used Dell 5820 Tower workstation to use as a NAS. I’m just getting started, but it seems great, and TrueNAS works well.
Problem is, I’m trying to use a 4x M.2 NVMe to PCIe add-in card, and I can’t get the dang thing to work! At least not in this computer. Only 1 of the 4 drives shows up.
And I’ve tried all the standard things ;_;
From googling, the 5820 should support PCIe bifurcation; many people have said they use it. Here’s my setup:
- BIOS version 2.36.0 from May 10, 2024.
- I am using a PCIe x16 slot that is wired x16. This tower has two electrical x16 slots, and I have tried both, including the one nearest the CPU.
- The processor is a Xeon W-2125. It supports bifurcation as far as I can tell, and with its 48 CPU lanes I have plenty of PCIe lanes to spare.
- VROC is turned off, and no VROC key is installed.
- VMD is off.
- I can’t find any setting in my BIOS to explicitly split a PCIe slot into x4x4x4x4 or other configurations, but as far as I can tell, I shouldn’t need one for this to work.
I have tested the card in another system with an ASUS TUF GAMING B650-PLUS WIFI motherboard. After manually setting the slot to RAID mode in that BIOS (clearly it’s not literally RAID, right?), I could see all 4 drives and use them individually in Windows. I could also do x8/x8 and see two of the drives. So the card works, at least in that context.
In the BIOS, under system info, under PCI information, it says the slot is populated by “Mass Storage,” which seems expected
I am stumped. I can boot into TrueNAS (from a different drive), and it sees the first drive, just like the BIOS does. But neither the BIOS nor TrueNAS can see the other three drives.
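For what it’s worth, here’s a quick way to check from the OS side whether the missing drives exist at the PCI level at all. It’s a minimal sketch, assuming TrueNAS SCALE (or any other Linux); the sysfs paths are standard:

```python
#!/usr/bin/env python3
# Minimal sketch (assumes Linux, e.g. TrueNAS SCALE): count the NVMe-class
# PCI functions the kernel enumerated, regardless of whether a driver bound.
# PCI class 0x010802 = mass storage / non-volatile memory / NVM Express.
from pathlib import Path

nvme_funcs = sorted(
    dev.name
    for dev in Path("/sys/bus/pci/devices").iterdir()
    if (dev / "class").read_text().strip() == "0x010802"
)
print(f"{len(nvme_funcs)} NVMe controller(s) on the bus: {', '.join(nvme_funcs)}")
```

If this only counts one controller, the other drives were never enumerated at all, i.e. the slot isn’t actually splitting; if all four show up but only one /dev/nvme* node exists, it’s a driver or namespace problem instead.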
Does anyone have an idea? I’m down to try any testing or settings changes suggested! There’s gotta be some random thing I’m missing.
What do you base this on? Not saying you’re wrong; I’m just curious, since I was under the impression that one always had to configure this in the UEFI.
I’m not sure actually, ha. I’ve looked into this specific computer and earlier/later models, and many people mention using add-in cards like this one. I haven’t been able to find pictures or exact instructions for how they did it, though.
From my understanding, you turn off VROC and VMD and it just… works? That surprised me too, but many people mention using these cards on this exact PC.
If anyone with a similar issue finds this: there may be hope!
The drives I was using to test all this were four 16GB M.2 NVMe Intel Optane M10 sticks. They’re the cheap, slow model you can find on eBay and the like; I wouldn’t generalize this situation to all Optane.
Throwing variables at the wall, I swapped the stick in the first slot of my adapter card for an old “normal” M.2 NVMe drive I have: a 500GB drive with, I think, a Windows 10 installation on it.
IT WORKED! The “normal” drive and the remaining three M10 Optane drives all showed up.
Now if I swap the “normal” drive to slot 2 on the card and leave an Optane drive in slot 1, it goes back to only showing the single Optane drive in slot 1.
Why this is, I have no idea. I may test further, and I may not. If I do I will report findings. If you see this in the future feel free to contact me, though there’s a good chance I won’t see it.
It may be something with the Optane sticks?? They’re all the same model, and a very quirky product; maybe they report in a way where, if one of them is the first to respond, no more are searched for? It may be because the “normal” drive has a boot partition? Or maybe detection just doesn’t work well when all the drives are the same model?
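If anyone wants to poke at the same-model theory, here’s a rough sketch (again assuming a Linux-based OS) that prints the model and serial of every NVMe controller the kernel attached, so you can at least confirm whether the visible drives report distinct identities:

```python
#!/usr/bin/env python3
# Rough sketch (assumes Linux): print model, serial, and PCI address for
# each NVMe controller the kernel attached, to check whether identical
# drives at least report distinct serial numbers.
from pathlib import Path

for ctrl in sorted(Path("/sys/class/nvme").glob("nvme*")):
    model = (ctrl / "model").read_text().strip()
    serial = (ctrl / "serial").read_text().strip()
    pci_addr = (ctrl / "device").resolve().name  # controller's PCI address
    print(f"{ctrl.name} @ {pci_addr}: model={model!r} serial={serial!r}")
```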
This thought was in the back of my mind from the start, but it seemed too dumb to work. This is a reminder to try all the dumb, easy solutions when troubleshooting! It reminds me of how taking 30 seconds to change your DNS server can magically fix network problems even when it makes no sense lol
I tried the add-in card with no drive in slot 1; this sent the computer into a permanent reboot loop. This computer does 3 reboots after every hardware change (that’s normal), but I sat through at least 10 reboots before pulling the plug.
I thought maybe bootloaders, partitions, or boot sectors might be the answer. So I installed Debian on another M10 drive and put it in slot 2 of the card, with my TrueNAS installation on an M10 in slot 1. It did not work; only the first drive showed.
My conclusion is this:
The problems are likely due either to this very early version of Optane (the M10) being weird, or (possibly more relevant to other people) to the M10 drives being B+M key and PCIe gen 3.0 x2. These are the only B+M-key M.2 drives I have, and the only 2-lane PCIe drives I have. That’s my guess.
So if you come here from Google for a similar reason, try putting a proper 4-lane NVMe drive in the first slot of your adapter card, or try using an M-key drive. I’d be fascinated to know the exact cause of this, but ultimately the fix is simple: I have a $10 M.2 PCIe 3.0 x4 M-key NVMe drive on the way from eBay as I type this.
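And if you want to confirm the lane theory on your own hardware: Linux exposes each PCI device’s negotiated and maximum link width in sysfs. A hedged sketch; an M10 should report a x2 link, while a “proper” drive should report x4:

```python
#!/usr/bin/env python3
# Hedged sketch (assumes Linux): show negotiated vs. maximum PCIe link
# width, plus link speed, for every detected NVMe controller. An Optane
# M10 should negotiate x2; most "normal" NVMe drives negotiate x4.
from pathlib import Path

for ctrl in sorted(Path("/sys/class/nvme").glob("nvme*")):
    pci = (ctrl / "device").resolve()
    cur_w = (pci / "current_link_width").read_text().strip()
    max_w = (pci / "max_link_width").read_text().strip()
    speed = (pci / "current_link_speed").read_text().strip()
    print(f"{ctrl.name}: link x{cur_w} (max x{max_w}) @ {speed}")
```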
Sorry if this should have been an edit rather than another comment that bumps the post. I’m new to the forums and learning etiquette!