RAID 10 NVMe issues

Hello Level One,

I meant to post here sooner as an enthusiast but now I am asking for help. Please redirect me if this is in the wrong area.

This is my setup.
https://pcpartpicker.com/b/z3P323

Long story short: I had a working RAID 10 setup. With the PSU off, I removed the PCIe expander card and a separate M.2 drive, then reinstalled the PCIe card (again with the PSU off), but now my OS will not boot. My UEFI settings show all four drives online and the RAID still active.

Is my data lost? All I really want is my data off the drive, but ideally I would be able to fix the RAID too. I am in over my head at this point and need assistance.

Thank you

Make sure that the motherboard is trying to boot the correct drive(s).
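
If you can get to a recovery command prompt (boot Windows install media and press Shift+F10), a quick read-only sanity check is to list the firmware boot entries and see which one the board is actually trying first:

    rem Lists the firmware boot-manager entries and their boot order.
    rem The Windows Boot Manager entry should point at the EFI partition
    rem on the volume you expect to boot from.
    bcdedit /enum firmware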

It should be trying to boot from the correct drive, as far as I know. I just tried again and disabled everything except that UEFI boot option. All I got after my usual UEFI screen was the five circling Windows dots until they froze and the machine restarted. When this issue first occurred I was getting past the dots and was able to try a Windows boot diagnostic.

In the RAIDXpert2 section of my UEFI I see three array entries: array 1 is 0.0 KB, array 1 is 2.0 TB, and array 2 is 255 GB.
When I had a working setup, I had a single NVMe drive on the motherboard (the original Windows install) and the Xpander held the four NVMe drives in RAID 10. I still had to select which Windows volume to boot from, even when booting in RAID mode instead of AHCI. Volume 4 was the original install on the NVMe on the board, while volume 5 was the RAID 10 volume. Is it an issue that I removed the drive that contained volume 4?

Yeah, you may need to put the removed drive back. I bet it has the real/main EFI boot partition on it?
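
One way to check which disk actually holds the EFI System Partition, without writing anything, is a small diskpart script run from a recovery command prompt (the disk number here is an assumption; repeat the select/list for each disk):

    rem Save as find-esp.txt and run: diskpart /s find-esp.txt
    rem Read-only inspection. The EFI System Partition shows up with
    rem type "System" and is usually a small (100-500 MB) FAT32 partition.
    list disk
    select disk 0
    list partition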

Could I put a sort of dummy drive in its place? I have since formatted that specific drive for a “work” laptop (I DJ sometimes), so the exact drive in its original state is not really available. I do have a different NVMe I could put in its physical slot, but it would be a Kingston replacing a Samsung, and they are different sizes.

Hmm … you might have to install Windows on that SSD in another machine with no other drives active/installed, then boot from it in the slot you removed the other drive from on the broken machine, in order to examine the situation with the RAID 10 array.

To do anything else, you run the risk of destroying the array, if you haven’t already. Some of the config lives in the UEFI, so I wouldn’t mess too much with the machine that has the missing disk.
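
If it does eventually come to rebuilding the boot files rather than just looking around, the usual shape of that (only after the data is safely backed up) is roughly this from a recovery prompt; the drive letters are placeholders and will differ on your machine:

    rem C: assumed to be the Windows install on the RAID volume.
    rem S: assumed to be a letter you assigned to the EFI partition in diskpart.
    rem Sketch only; do not run this against a degraded array before backing up.
    bcdboot C:\Windows /s S: /f UEFI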

Good news everyone!

I have gotten my RAID setup working, for now. I put the same physical drive back in the same physical slot, now carrying a Windows install from my DJ laptop. I wonder if I never fully formatted the drive and the main EFI settings were still there? My UEFI settings now show array 1 populated with a drive and array 2 still the 2.0 TB RAID 10. Not really sure why I didn’t think of this before; it was an extremely simple solution.

Something interesting: when I boot, I get the option for a volume 12 and a second volume with no title. If I select volume 12, it boots into the Windows that was on the lone NVMe. If I select the unnamed volume, it tries to boot, restarts, then gives no volume option but boots into the RAID anyway. I have been able to recreate this a few times (after backing up my data, of course).
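
My guess is the unnamed entry is a stale firmware boot record left over from the old install. It should show up with a GUID in bcdedit, and presumably could be removed once I am certain the RAID volume boots on its own ({GUID} below just stands in for whatever identifier actually appears):

    rem List the firmware boot entries and note the identifier of the stale one.
    bcdedit /enum firmware
    rem Remove it only after confirming the RAID volume still boots without it.
    bcdedit /delete {GUID}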
