Is it possible to build a 4TB SSD RAID + program backups externally to another drive?

Hey All!

I’ve been editing 1080p, 2K, and now 4K-and-up videos on external SSDs for a long time. I’m now hitting a point where a single project - all the raw media, exports, renders, effects, etc. - fills up a 2TB Samsung T5 SSD, which is a problem.

I’ve been reading and watching some L1 tips on RAID, and my takeaways and hopes for a new storage solution are these:

In my mind, I’d like to get four NVMe M.2 Samsung SSDs and configure them in an external bay to act as a 4TB SSD RAID - configured as RAID 0 for speed - and then daisy-chain a backup drive (a 4TB HDD) that backs up the SSD editing drive every 10-30 minutes or so.

Maybe the 4TB SSD RAID is possible with those internal drives, or maybe I’m missing the obvious - but even so, with RAID 0 being as risky as it is, my idea to tackle that risk was the daisy-chained external backup. I’m just not sure those two would play nice together.

I’ve mainly seen people talk about using them internally in a PC build, and at best combining two NVMe drives for 2TB of SSD - but as I said above, that’s not enough for me.

Again, I recognize I could be completely wrong in my thinking here, but combining those SSDs seemed like a better route than an HDD RAID setup (the obvious cheap mass-storage solution).

Appreciate any help, thank you!

The issue seems to be that some external enclosures, like the ones from OWC, bottleneck the NVMe SSDs at around 700MB/s because they only use a single PCIe connection and divide that bandwidth over the four NVMe drives - making the array barely faster than a USB 3 connection. So if there’s a bay that doesn’t bottleneck the SSDs, that would be the move.

I’d recommend 4x 2TB NVMe in RAID 6. Still 4TB of capacity, but with a redundancy of 2 (i.e. you can lose 2 drives and still retain your data). Super fast since it’s NVMe, despite the performance hit of RAID 6, but perhaps over your budget. These NVMe drives are mounted on a PCIe x16 card, but that requires your mainboard to support bifurcation on that particular PCIe slot. Most modern mainboards do, but not all, and older systems most likely won’t support the feature. Your OS needs to support it too; I understand Win10 has problems with such a setup. Linux will support it, and macOS most likely will too (but I’m not a Mac user!)
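Purely to illustrate the capacity trade-off: a minimal Python sketch of the RAID 0 vs RAID 6 arithmetic for the 4x 2TB layout above (illustrative only, not tied to any particular controller or filesystem):

```python
# Rough capacity math for the layouts discussed above (sizes in TB).
# Plain striping/parity arithmetic only; filesystem overhead is ignored.
def usable_capacity(num_drives: int, drive_tb: float, level: int) -> tuple:
    """Return (usable TB, number of drive failures tolerated)."""
    if level == 0:
        return num_drives * drive_tb, 0          # striping only: no redundancy
    if level == 6:
        return (num_drives - 2) * drive_tb, 2    # two drives' worth of parity
    raise ValueError("only RAID 0 and RAID 6 are covered in this sketch")

for level in (0, 6):
    tb, failures = usable_capacity(4, 2.0, level)
    print(f"4x 2TB in RAID {level}: {tb:.0f}TB usable, survives {failures} drive failure(s)")
# 4x 2TB in RAID 0: 8TB usable, survives 0 drive failure(s)
# 4x 2TB in RAID 6: 4TB usable, survives 2 drive failure(s)
```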

Now, your planned backup schedule is going to be a problem. HDDs are limited in bandwidth: the SATA port tops out at 500-550MB/s, and being mechanical devices they usually won’t get anywhere near that limit. Realistically, 170-ish MB/s is a decent average. Copying 2TB (= 2 million MB) of data will cost you 2,000,000 / 170 = 11,765s, which is roughly 3.3 hours. So restarting a full backup every half hour doesn’t make any sense, never mind a 10-minute interval.
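If you want to plug in your own numbers, here is the same back-of-the-envelope estimate as a tiny sketch (the 170MB/s and 550MB/s figures are just the averages assumed above):

```python
# Estimate how long a full copy takes at a given sustained throughput.
def copy_time_hours(data_tb: float, throughput_mb_s: float) -> float:
    data_mb = data_tb * 1_000_000          # 1TB ~= 1,000,000MB (decimal units)
    return data_mb / throughput_mb_s / 3600

print(f"2TB at 170MB/s: {copy_time_hours(2, 170):.1f} h")   # ~3.3 h (typical HDD average)
print(f"2TB at 550MB/s: {copy_time_hours(2, 550):.1f} h")   # ~1.0 h (even at the SATA limit)
```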

As mentioned before, NVMe is quite a bit faster: PCIe 3 offers up to about 3.5GB/s per drive, and PCIe 4 doubles that. But that introduces another bottleneck: your network. Moving that amount of data over a network at those speeds requires at least 10Gbit/s, better 25 or even 40Gbit/s. Note the difference between bit and byte! Theoretically it’s a factor of 8 (1 byte = 8 bits), but in practice, taking extra overhead into account, a 1:10 ratio is a better rule of thumb. So 10Gbit/s ~ 1GB/s.
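The bit-versus-byte rule of thumb is easy to encode too; this little sketch just applies the 1:10 ratio suggested above (illustration only, not a benchmark):

```python
# Convert a nominal link speed (Gbit/s) into a realistic payload rate (GB/s)
# using the ~1:10 rule of thumb (8 bits per byte plus protocol overhead).
def usable_gb_per_s(link_gbit_s: float) -> float:
    return link_gbit_s / 10

for link in (1, 10, 25, 40):
    print(f"{link:>2} Gbit/s link -> ~{usable_gb_per_s(link):.1f} GB/s of actual data")
# A PCIe 3 NVMe drive pushing ~3.5GB/s needs roughly a 40Gbit/s link to keep up.
```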

HTH!

For windows, there’s StableBit DrivePool.

https://stablebit.com/DrivePool/Features

Thanks for the response! Very helpful!
I didn’t even think 2TB NVMe drives were available on the market (which is why I first said 4x 1TB SSDs), but I found them on B&H!

“Intel 2TB 660P NVMe M.2 Internal SSD”

Four of those run about $900 - I can work with that; I’m just curious how the cost adds up with the other pieces once all is said and done.

Yes, I realize that now after your help, haha - thank you! I originally wrote that thinking I’d have to run RAID 0 (instead of RAID 6) to maximize speed at the cost of redundancy, so I figured a frequent backup of the external RAID 0 could keep the data safe (important for me while editing a project) - but I see now how impractical that is.

Yes, RAID 6 with 4x 2TB definitely seems like the way to go. 4TB of workable SSD in RAID 6 is my desired route.

Finding a mobo with the right ports seems to be tricky.

Normal mobos I’ve found thus far don’t typically offer more than 2 internal M.2 SSD slots to begin with - so I’d need at least 2 internal SSD slots on the mobo to start, and then at least 2 more PCIe 4.0 slots, or 2x PCIe 4.0 x16.

Or did you mean a total of 4 PCIe 4.0 slots on the motherboard? Or PCIe 4.0 x16 (x4 mode)?

I’ve found this one, but maybe it’s overkill for a custom RAID?
" MSI PRESTIGE X570 CREATION AM4 E-ATX PRESTIGE X570 " from B&H

My current internet connection speed is 400Mb/s at best, I’m not looking to use network functionality or multiple users, and I don’t plan on using any of this for Plex or streaming.

So could I avoid the network bottleneck by not running this as a NAS, but rather as a DAS-type RAID?

Connected, in theory, to my editing computer (Z490, i9-10700K / 64GB / 2080 Ti) via a Thunderbolt 3 cable?
Or would the Thunderbolt cable be the bottleneck connecting to my editing PC?

Thanks again!

Ok! Does StableBit DrivePool fix the issue with supporting bifurcation of the PCIe slot that Dutch_Master mentioned above?

I’m running Win10 Pro, and I’ve heard the operating system needs to support it to make something like this 8TB NVMe (4TB usable in RAID 6) setup work with all these PCIe slots in use.

What I was aiming at is a separate SSD for the OS (Linux, FreeNAS, UnRAID) and then the 4 NVMe drives mounted on an adapter in a PCIe slot, like the ASUS HYPER M.2 X16 CARD V2. As for choosing the main board, you’ll need to read the manual, or at least the spec sheet, to see if at least one PCIe slot allows for x4/x4/x4/x4 bifurcation. Of course, with the entire RAID mounted on a single card, that leaves any M.2 slots on the main board free for an NVMe-based OS drive. For the OS you can get away with a 500GB, 256GB or even just a 128GB drive.
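Once the card is installed, a quick way to sanity-check that bifurcation is actually working is to confirm that all four NVMe devices get enumerated. A minimal sketch, assuming Linux and the usual /dev/nvme*n1 device naming (names will differ on other platforms):

```python
# List NVMe namespaces visible to Linux; with x4/x4/x4/x4 bifurcation working,
# all four drives on the adapter card should appear (plus any on-board OS drive).
import glob

namespaces = sorted(glob.glob("/dev/nvme*n1"))
print(f"Found {len(namespaces)} NVMe device(s):")
for dev in namespaces:
    print(" ", dev)
# If only one drive on the card shows up, the slot is likely not bifurcated
# (check the BIOS/UEFI setting for that PCIe slot).
```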

PS: there is an 8TB NVMe drive available on the consumer market, but that’ll cost you a fortune: $1500+ for sure.

That’s a hardware issue with your motherboard.

TL;DR

DrivePool is basically a plugin for Windows that makes a pool out of all the logical drives that Windows can see.
