8+ TB SSD recommendations

Is the Micron 9400 Pro/Max a big upgrade from the 9300 Pro/Max for the 8–16 TB SSDs?

Whether it’s worth it depends on the price you can get xD It’s PCIe Gen 3 (at best 4 GB/s) vs PCIe Gen 4 (at best 8 GB/s in a perfect world). For sequential I/O that would make quite a difference, roughly halving the time to read the whole drive, but since you’re working on rather small files it’ll probably have a smaller impact than one might expect. If the price difference isn’t very big it may be worth going for the 9400, but if it’s something like 2x more expensive I wouldn’t bother. And I certainly wouldn’t bother upgrading if you already have a 9300. But that’s just my opinion, based on my attitude toward workstation budget.

2 Likes

I appreciate your responses. You mentioned using software to copy files in parallel. Can you point me to where I can learn this? I watched several videos that explained what is happening, but nothing I could find showed me how to do it, or the software to accomplish it.

2 Likes

I’m not really a Windows person, so I’m afraid I won’t be able to help much in this area. However, @MazeFrame mentioned robocopy, and a quick search suggests it may indeed come in handy. For Linux I found rclone, which looks like a variation on rsync that supports multithreaded file copying, but I have yet to verify whether it works as expected.
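On Linux you can also get parallel copying with nothing but coreutils, using `xargs -P` to run several `cp` processes at once. A minimal sketch; the directory names and file contents below are placeholders, and `cp --parents` (which preserves the relative path under the target) is GNU-specific:

```shell
# Build a small sample tree (placeholder data) to copy from.
mkdir -p source/sub dest
echo "hello" > source/sub/a.txt
echo "world" > source/b.txt

# Copy every file under source/ into dest/, with up to 8 cp
# processes running concurrently (-P 8). --parents recreates
# each file's relative path under dest/.
find source -type f -print0 | xargs -0 -P 8 -I{} cp --parents {} dest/
```

The speedup comes from overlapping the per-file overhead (open, metadata, close) across workers, which is exactly where one-file-at-a-time copying loses time on thousands of small files.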

1 Like

With the regular Windows Explorer (the normal window your files are shown in), files are copied one at a time.
That is fine for a handful of larger files, but gets very slow for hundreds or thousands of small files.

One built-in tool in Microsoft Windows and Windows Server is robocopy (see Microsoft’s robocopy documentation).
With robocopy, you can specify a lot of things.
For example, if you want a 1:1 clone of your production data replacing the oldest backup in rotation, but do not care about empty folders, you would run this command in PowerShell:

robocopy D:\Production X:\ExternalBackup /s /b /mir /log:X:\copy1.txt

Command breakdown:

  • D:\Production - Source Directory, files are copied from here
  • X:\ExternalBackup - Target Directory, files are copied to this directory
  • /s - includes non-empty sub-directories
  • /b - Backup mode, copies with backup privileges so the admin does not get denied access to restricted files
  • /mir - mirrors the source onto the target, deleting target files no longer present in the source (Don’t do this for backups!)
  • /log:… - logs the process to a file for manual checking, debugging or log-keeping

This is still slow, almost equivalent to drag-and-drop copying using Explorer. If the machine crashes or loses connection in the middle of a large file, it has to redo all of that work. And the log only shows the files it copied, not all the files it touched.


robocopy D:\Production X:\ExternalBackup /e /b /z /MT:16 /log:X:\copy2.txt /v

Command breakdown:

  • D:\Production - Source Directory, files are copied from here
  • X:\ExternalBackup - Target Directory, files are copied to this directory
  • /e - includes empty sub-directories
  • /b - Backup mode, copies with backup privileges so the admin does not get denied access to restricted files
  • /z - Copying can be restarted mid-file (useful when handling large files over mediocre networks)
  • /MT:16 - 16-way multithreading
  • /log:… - logs the process to a file for manual checking, debugging or log-keeping
  • /v - verbose logging

Notice:

  1. The existing files on the target are no longer removed; instead, as any good backup does, we keep them just in case
  2. /MT will make things faster until the machines on either end, or the network in between, can no longer keep up.
  3. All log files are worthless when nobody reads them
  4. This is not a proper backup, this is a “copy”. Proper backups have versioning, contain everything, and audit themselves through checksums and other safety mechanisms. I recommend reading through this: http://taobackup.com

6 Likes

Mind blown! There goes my weekend. I will let you know how far I get, thanks again!