Need Help with NVME Raid on x470 Crosshair VII

Set everything to pure stock and retest, then tweak from there? Except memory. Start with memory, then modify from there. Try to attain 2933 to 3200 stable, but you need to test at each memory speed and timing set because you can regress. Push the OC too much and I/O suffers.

Understood. I’ll try what you’ve suggested. This gives me a direction to take things. Thank you. @wendell

The best thing you can do for a processor OC is just to enable PBO, ditch the power and thermal limits, and let the CPU do its thing. That’s 4GHz all-core for me on a 2700X and 4.35GHz on 1–2 cores, on this motherboard, with a base clock of 100.5MHz. That config has offered the best overall performance vs. any manual OC at all. 3200 with 14-14-14 timings is solid, but 2933 is really close too, so if you test you may find 2933 is faster than 3200 depending on timings and other factors.
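
For a rough sense of why 2933 can hang with 3200: first-word latency is just the CAS count divided by the memory clock, so tighter timings claw back most of what the lower clock gives up. Back-of-the-envelope sketch (the 2933 timings here are illustrative, not measured):

```python
# Back-of-the-envelope DRAM first-word latency: CAS cycles / memory clock.
# The memory clock is half the transfer rate (DDR). 2933 timings are illustrative guesses.
def first_word_latency_ns(transfer_rate_mts, cas):
    return cas / (transfer_rate_mts / 2) * 1000

for rate, cas in [(3200, 14), (2933, 14), (2933, 16)]:
    print(f"DDR4-{rate} CL{cas}: {first_word_latency_ns(rate, cas):.2f} ns")
```

A couple of tenths of a nanosecond either way, which is why measured results can go to whichever speed holds the tighter timings.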


wendell is probably correct about I/O throughput being handicapped by manual OC settings. You are correct about what I meant with the 1GB vs 2GB test size. Worked for me.


Sorry to sound dumb, but PBO isn’t a feature available yet in Ryzen Master. Is there a feature in BIOS that is PBO but just labeled differently?

Yep, it’s a setting, and the help text is like “load board defaults for X, Y and Z” (Enabled), while Auto is “AMD defaults”.
Set that and leave everything else on not-really-overclocked settings.
Memory at 2666/2933 should be fine too, to start as a baseline. Then bump memory speed to 3200 and play with timings. Only after that’s dialed in and you’ve tested RAID speed in those scenarios should you tweak further.


Well, still no luck… I’ve tried everything, including: disabling all other storage ports, disabling most USB ports/headers, and messing with as many BIOS settings as I could find. I’ve read forum threads, watched your video many times, watched many Threadripper videos to gain insight, and tried many more things. Tried the RAIDable way… tried installing standalone drivers without the rccfg driver files during the front-end install… tried ASUS RAID driver sources, AMD RAID driver sources, and all the Threadripper drivers I could find on the net! I even went as far back as two previous BIOS revisions (601, 702)… went back to the current BIOS version 804 and slowly overclocked the memory separately, then put that back at stock and slowly overclocked the CPU… then overclocked both memory and CPU… Still no change!

I even tried installing Windows 10, non-RAID, on just one Samsung 970 Pro NVMe 1TB drive in the M.2_1 slot and ran CrystalDiskMark there, only to see very underperforming sequential read/write again! That got me thinking…

I went in and installed Samsung Magician and found out that the drive was not recognizable by the software. Also, Samsung Magician reported it as NVMe PCIe 3.0 x2… the other drive was still installed in M.2_2 and was showing up as NVMe PCIe 3.0 x4, as I would expect. They are both 970 Pros… So that’s when I disabled as many ports, headers, and features as I could think of… still no change after trying RAID0 again… The best I can do is 3600 MB/s sequential read and 3300 MB/s sequential write. Nowhere near your results, Wendell.
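
For anyone following along, the math lines up with a link-width problem: a Gen3 x4 drive tops out around 3.5 GB/s sequential, so two of them in RAID0 should approach 7 GB/s, while a drive negotiating only x2 is capped near half that, which is roughly what the array was giving me. If you have a Linux live USB handy, you can read the negotiated link width straight out of sysfs without any vendor tools; a rough sketch (controller names and which physical slot they map to will differ per system):

```python
# Minimal sketch (Linux live USB): print the negotiated PCIe link width and speed
# for each NVMe controller straight out of sysfs.
from pathlib import Path

for ctrl in sorted(Path("/sys/class/nvme").glob("nvme*")):
    pci_dev = (ctrl / "device").resolve()              # the PCI device behind this controller
    model = (ctrl / "model").read_text().strip()
    width = (pci_dev / "current_link_width").read_text().strip()
    speed = (pci_dev / "current_link_speed").read_text().strip()
    print(f"{ctrl.name}: {model} -> x{width} @ {speed}")
```

On Windows, the PCIe interface Samsung Magician reports is the same information, which is how I caught it in the first place.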

I will try to swap the drives into the opposite M.2 positions and see if I can isolate the drive or the PCIe lanes.

I was curious though, where did you find BIOS version 508 as seen in your tutorial video? I can only find 509 on the ASUS website…

The newest BIOS works fine. Other users were tripped up by the CSM settings but got it working pretty easily. It’s not that crazy, tbh.

If it’s PCIe 3.0 x2, make sure you don’t have any SATA devices plugged in? There are some ports that share bandwidth? It works for you but is just slow, right?

Yeah, I agree with you for the most part. The bios works 99.99% of the time for nearly everything. With the exception of one thing, good Sir.

I noticed that almost everyone else having success is using 960 Pros. Even your example shows how it can be done well with 960 Pros. And I have to take a moment to express deep appreciation for your work and the tutorials you provide. You are a true gem in the tech world.

I’d already ironed out the potential CSM conflict early on in the process, before all my parts arrived… hahaha.

I unplugged all SATA devices before and turned off all the SATA ports. I believe SATA ports 1 and 2 share bandwidth with one of the M.2 slots. But as we know, other PCIe 3.0 lanes are shared with the M.2_2 slot. And therein lies the problem, as I believe it to be at this stage.

Woke up this morning, put on my ‘chaos ensues’ shirt and said, “What is the core issue? Is this a matter of firmware recognition and configuration by the BIOS? Will we ever get off this burning planet?!?!?”

What I’m starting to think is that the AMI (American Megatrends Inc.) native NVMe driver support is not meshing well with the 970 Pro. This suspicion is confirmed by the latest QVL.

Usually I ignore the QVLs, use non-QVL stuff on my own, find workarounds and get it going. THIS has been the most challenging one yet. The 970 Pro is such a new product that a BIOS covering native support for it hasn’t come out yet, which would also explain why I have yet to see the “Advanced > NVMe Configuration” menu item show up in the BIOS. The incompatibility also shows up when I install Windows on one 970 Pro, NO RAID, then later add in the other 970 Pro, install the Samsung drivers, and go back to Samsung Magician and… voilà: the drive is so new that not even Samsung Magician recognizes it. Apparently there is a version 5.3.0 of Samsung Magician out in the wild that hasn’t been officially released to the public and which correctly identifies the drives as 970 Pros. Anyways…
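
Side note in case it helps anyone else stuck at this point: even when Magician can’t identify the drive, the model string and firmware revision can be read straight off the controller, e.g. from a Linux live USB via sysfs. A minimal sketch, with the same caveat that controller names vary per system:

```python
# Minimal sketch (Linux live USB): report each NVMe controller's model and firmware
# revision from sysfs, independent of any vendor utility.
from pathlib import Path

for ctrl in sorted(Path("/sys/class/nvme").glob("nvme*")):
    model = (ctrl / "model").read_text().strip()
    firmware = (ctrl / "firmware_rev").read_text().strip()
    print(f"{ctrl.name}: {model}, firmware {firmware}")
```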

The drive that was showing PCIe 3.0 x2 is now showing PCIe 3.0 x4!!! Eureka!

Right?

not so fast…
What seems to happen is that the NVMe drive gets firmware instructions via the operating system drivers. When those get installed in the OS environment, the drive stores that in its firmware and knows how to present itself to the BIOS. So the issue is getting the drives into an x4 state, then going back into the BIOS for a RAID0-based Windows 10 install without losing that x4 configuration. I believe I have to install the graphics card in either the PCIe3_2 or the PCIe3_4 slot in order to trick the drives into keeping their x4 designation within the BIOS native driver. That’s very hard to do, because when I set up the RAID0 the BIOS warns that creating the array will erase all data on the drives. I’m assuming this includes the firmware instructions on how to be configured as a PCIe NVMe SSD.

I’m only chronicling all this for the thread so that others trying to RAID0 Windows 10 on 970 Pros in pursuit of bleeding-edge speed aren’t caught off guard and think their board or drives are defective, compelling them to RMA unnecessarily.

I’ll try different graphics card slot configurations and get back to you. Fingers crossed.

Huzzah!!!

…it worked :smiley::smiley::smiley:

I had to first install Windows on a single 970 Pro drive. Then shut down and insert the other NVMe drive into the remaining M.2 slot. Then boot into Windows and install the Samsung 970 Pro NVMe drivers for both drives. This loads the firmware the NVMe drives need to talk properly with the UEFI BIOS.

Then I powered down, moved the graphics card to PCIEX4_3 (the bottom slot), booted back into Windows just to confirm nothing had changed on the NVMe side, and powered down again. Booted into the UEFI in advanced mode, set it up for a RAID-based install, and built the RAID array using RAIDXpert, as demonstrated in your tutorial.

The Windows install went as one would expect; I loaded the latest NVMe RAID (9.2.0.70) drivers during the Windows install. Several reboots later, back into Windows, I loaded CrystalDiskMark and BOOM!

[Screenshot: NVMe RAID0 with only chipset drivers installed]

This is before installing all the drivers for the rest of the platform. The only drivers I installed were the x470 chipset drivers from the ASUS support site (things like power management and such).
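
If anyone wants a quick sanity check of the array without installing CrystalDiskMark, here’s a rough single-threaded sequential pass in Python. The file path and size are placeholders, it runs at queue depth 1 so it will trail CDM’s multi-queue sequential numbers, and the read pass can be inflated by the OS file cache unless the file is bigger than your RAM:

```python
# Rough sequential-throughput check: write one large file, sync it, then read it back.
# Single-threaded, queue depth 1; the read may be served partly from the OS file cache
# unless the file is larger than installed RAM. Path and size are placeholders.
import os, time

TEST_FILE = r"D:\raid_throughput_test.bin"   # point this at the RAID0 volume
SIZE = 4 * 1024**3                           # 4 GiB total
CHUNK = 16 * 1024**2                         # 16 MiB per write/read
block = os.urandom(CHUNK)

start = time.perf_counter()
with open(TEST_FILE, "wb", buffering=0) as f:
    for _ in range(SIZE // CHUNK):
        f.write(block)
    os.fsync(f.fileno())                     # make sure the data actually hit the array
write_s = time.perf_counter() - start

start = time.perf_counter()
with open(TEST_FILE, "rb", buffering=0) as f:
    while f.read(CHUNK):
        pass
read_s = time.perf_counter() - start
os.remove(TEST_FILE)

print(f"write: {SIZE / write_s / 1e6:.0f} MB/s, read: {SIZE / read_s / 1e6:.0f} MB/s")
```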

Thanks for the help man! Hope this tutorial helps other 970 Pro users on this platform.

As an aside, here are my build specs:

ASUS ROG Crosshair VII Hero Wi-Fi
BIOS revision 804
AMD Ryzen 7 2700X
G. Skill FlareX DDR4 3200 CL 14-14-14-34 (F4-3200C14Q-32GFX)
AMD Radeon RX580
2 Samsung 970 PRO NVMe M.2 in RAID0
Corsair HX850i
Lian Li test bench
Windows 10 Pro 64-bit (version 1803, build 17134.112)
No overclock… Yet

WOAHHHHH. 7k MB/s! That is very impressive.

I wanna get that sorta speed. But don’t wanna raid.

Great job on figuring it out bro.


All inspired by and thanks to you, @wendell!
I am pretty proud too.
I didn’t want to lose out on the investment I made in this platform.

My intention for this computer is gaming and casual web use, but with capability and throughput to spare.

Next I plan on building a FreeNAS or unRAID system to hold my data. That’s why I don’t mind having this system in RAID0. But I can see why someone would prefer to have their only rig in RAID1. (cough cough… iMac Pro)

You might find this perspective interesting… I’m upgrading my life from a hard-drive-based FX-8350 platform from 2012… hahahah :sweat_smile:


Lol. I was gonna say you got me mixed up with Wendell


Hi,

Greetings from Hungary. Love your videos, subbed since you left Tek Syndicate. Keep it up, you are great.

I am facing difficulties setting up my NVMe RAID 0 on the X399 Strix. Sorry to write here, but in principle it is the same as on X470.

So, in the BIOS I’ve set up my array; it’s the only one, with 2 drives. Everything seems OK, but when I try to install Windows I can’t see the drives as one partition after loading the 3 NVMe drivers, just the two partitions separately. What is wrong?

I also have 2 SATA drives hooked up; can this cause the issue?

Thanks.

@wendell Tried with the SATA drives unplugged, same result :frowning:

I’m having similar problems to @gyik: Windows refuses to see the RAID array I’ve created from 2x NVMe drives. It can see every other drive, just not those.

Hey, so I figured out the issue (kind of) on my side. For some reason it’s down to the 1809 version of Windows 10. The setup from that ISO just does not see the array, or crashes when loading the driver. The 1803 version works fine.

Weirdly, if I load 1809 onto a different (non-RAID) partition, then Windows boots and sees the array just fine.

I was also thinking about trying a different Windows version, I was just too lazy to do it. Now I will try it on the weekend. Thanks! :slight_smile:

No success. The Win 1803 installer now sees the RAID 1 array as one partition, but at the first boot after the installation the system hangs. After a restart, a message says there was an error during installation, so the installation is not complete: please reboot and reinstall Windows.
:frowning:

Does anyone have input on what the Read Cache Policy and Write Cache Policy settings should be?