I have quite a few PowerEdge servers ranging from 12th to 15th gen and have often used single M.2 NVMe-to-PCIe adapters with zero issues, but trying to run dual NVMes on a single PCIe adapter with bifurcation is proving to be a fruitless endeavor. I’m not even trying to use them as boot devices… just storage, primarily as SLOG and special vdevs for TrueNAS. This is with two Samsung 1.92TB PM983 22110 NVMes.
I’ve so far purchased three different quad adapters, two of them the “switching” type, and none have worked. I have tried every PCIe slot in the server and always get the same result:
“UEFI0067: A PCIe link training failure is observed in SlotX and the link is disabled.”
I have tried every BIOS setting: Platform Default Bifurcation, Automatic Discovery of Bifurcation, and Manual Bifurcation Control (x4 or x8), but always get the same “UEFI0067” fault during boot. The BIOS, iDRAC, and all other firmware are the latest, and the system has otherwise been rock solid.
Has anyone been able to successfully use dual NVMes (non-SATA) on a single PCIe adapter in an R740(xd)? What adapter did you use?
Hello,
I have an R740, and there is no need for “switching”-type adapters. I have successfully used the Dell Ultra-Speed Drive Duo, Supermicro AOC-SLG3-2M2, and ASUS Hyper M.2 X16 V2 cards. The Dell and Supermicro are dual-M.2 cards, and the ASUS is a quad card. No need to mess with BIOS settings; the PCIe slots bifurcate automatically. Obviously you’d want dual adapters in x8 slots and quads in x16. It truly is “plug-and-play”, and bootable as well.
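If it helps, a quick sanity check after booting into Linux (just a sketch; it assumes the kernel’s nvme driver has loaded, and the device paths are illustrative) is to count the NVMe namespaces that enumerated. On a dual carrier in a bifurcated x8 slot you’d expect two:

```shell
#!/bin/sh
# Sketch: count the NVMe namespaces the kernel enumerated after installing
# a bifurcated M.2 carrier. Device-node naming (/dev/nvmeXn1) is the standard
# Linux convention; a dual carrier should show 2, a populated quad up to 4.
count=0
for ns in /dev/nvme[0-9]n1; do
    [ -e "$ns" ] && count=$((count + 1))
done
echo "NVMe namespaces visible: $count"
```

If the count comes up short, the missing drive either failed link training or isn’t seated, which narrows things down before touching BIOS settings.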
Thank you! That’s what I thought, but even the “non-switching” card didn’t work. No drives ever showed up, regardless of the BIOS bifurcation settings.
I’ll definitely give the AOC-SLG3-2M2 a shot. Thank you again!
Ok, so I got the Supermicro AOC-SLG3-2M2, added the two Samsung 1.92TB PM983 SSDs to it, installed it in slot #4, set the BIOS bifurcation to ‘Platform Default Bifurcation’, but still continually get the following fault at boot…
I’m at a loss as to what to do. If I take any of my numerous single-M.2 PCIe adapters and put it in any slot, it recognizes the SSD fine.
“UEFI0067: A PCIe link training failure is observed in Slot(X) and the link is disabled.
Do one of the following: 1) Turn off the input power to the system and turn on again. 2) Update the PCIe device firmware. If the issue persists, contact your service provider.”
Available options:
F1 to Continue and Retry Boot Order
F2 for System Setup (BIOS)
F10 for Lifecycle Controller
Enable/Configure iDRAC
Update Server Firmware
Help Install an Operating System
F11 for Boot Manager
Disregard. I think I’ve discovered that, somehow, both drives died at the same time. I took a known-good x4 PCIe adapter that only supports drives shorter than the 22110 form factor, but rigged it to hold the longer drive for testing. One of the drives generates no error but isn’t recognized at all; the other drive is the one generating the UEFI0067 fault.
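For anyone landing here later, a quick way to confirm it’s the drive rather than the slot is to check SMART from a known-good machine. A rough sketch (assuming Linux with nvme-cli installed; the device paths are illustrative, not from this thread):

```shell
#!/bin/sh
# Sketch: rule out a dead NVMe drive before blaming the adapter or BIOS.
# Assumes nvme-cli; controller paths like /dev/nvme0 are illustrative.
check_nvme_health() {
    if ! command -v nvme >/dev/null 2>&1; then
        echo "nvme-cli not installed (e.g. 'apt install nvme-cli')"
        return 0
    fi
    found=0
    for dev in /dev/nvme[0-9]; do
        [ -e "$dev" ] || continue
        found=1
        echo "=== $dev ==="
        # A healthy drive reports 'critical_warning : 0'; nonzero values or
        # climbing media_errors point at the SSD itself, not the carrier card.
        nvme smart-log "$dev" | grep -E 'critical_warning|media_errors'
    done
    [ "$found" -eq 0 ] && echo "no NVMe controllers detected"
    return 0
}
report="$(check_nvme_health)"
printf '%s\n' "$report"
```

A drive that won’t even enumerate (like the first one here) never makes it far enough to answer a smart-log query, which by itself is a strong sign the SSD is dead rather than mis-bifurcated.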
Thank you for your assistance. The last thing I could have anticipated was -both- drives dying.