Hi,
I’ve ordered all the hardware I need for ML inference and basic training as part of my switch from DevOps to MLOps.
My rig will have this hardware:
CPU: 9950X3D
MB: ASUS X870E ProArt
RAM: 192GB Kingston non-ECC (4×48GB DDR5-6000). I’ll work on tuning the configuration to run the RAM at 6000 MT/s with dual-rank (2R) DIMMs at 2DPC
NVMe: 3×4TB WD SN850X
GPU: 2× Radeon RX 9070 XT Red Devil
My questions:
- What is the best CoW file system for a striped NVMe configuration? I’ll use striping because there will be no data I can’t re-download from my NAS. Later, I want to switch to a local HDD + NVMe cache configuration, since that will be faster than a 10Gb network. Could you provide a link or an example of a filesystem configuration for NVMe?
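To illustrate the kind of setup I mean, here is a rough sketch using Btrfs as one CoW candidate (device names and mount options are placeholders, not a recommendation):

```shell
# Sketch: striped (RAID0) Btrfs across three NVMe drives.
# Data AND metadata striped, since everything here is re-downloadable.
mkfs.btrfs -L scratch -d raid0 -m raid0 \
  /dev/nvme0n1 /dev/nvme1n1 /dev/nvme2n1

# noatime cuts write amplification; zstd:1 is a light compression level.
mount -o noatime,compress=zstd:1 LABEL=scratch /mnt/scratch
```

I’d be interested in whether people would pick different `-m` profiles or mount options for this workload, or a different filesystem entirely.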
- What is the best distro for running all my AI containers, given that I have more experience with Debian and Red Hat but they are not on the bleeding edge with kernel and packages? I understand that I can install a mainline kernel, but I’d prefer to switch to a distro that, even via experimental or dev repositories, works properly with the 9070 XT.
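For context, the container workflow I have in mind is along these lines (image name and device flags taken from AMD’s ROCm container docs; whether RDNA 4 / the 9070 XT is actually supported is exactly my question):

```shell
# Sketch: launching a ROCm-enabled PyTorch container.
# --device flags expose the GPU; 9070 XT support is the open question.
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  --security-opt seccomp=unconfined \
  rocm/pytorch:latest \
  python3 -c "import torch; print(torch.cuda.is_available())"
```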
Thank you for your time and advice.
Have a nice day!