An SSD suitable for handling multiple small files at once?

hello-

I use a program called UTAU that generates a vocal track by reading wave files, finding their settings in a config file, and then time-stretching and pitch-shifting them on the fly.

This creates a CRAP TON of very small "cache" files, and it ends up getting bogged down by my hard disk.

The app is natively single-core, so it only makes one file at a time, but it is still quick enough to get held back by the disk.

The issue gets worse because I use a tool that makes it multi-core capable, and then it REALLY gets bogged down, since every core in the computer is generating cache files at once.

So for each cache file it has to:

-read the source wav file

-read the configuration txt file

-modify the audio (time-stretch / pitch-shift)

-write the small cache file

Over and over.
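
Roughly what that loop looks like, as a minimal sketch in Python (the file layout, config format, and processing step are placeholders for illustration, not UTAU's actual internals):

```python
import wave
from pathlib import Path

def render_note(source_wav, config_txt, cache_dir):
    """Illustrative per-note pipeline: read source, read config,
    modify the audio, write a small cache file. Not UTAU's real code."""
    cache_dir = Path(cache_dir)
    cache_dir.mkdir(exist_ok=True)

    # 1. read the source wav file
    with wave.open(str(source_wav), "rb") as w:
        params = w.getparams()
        frames = w.readframes(w.getnframes())

    # 2. read the configuration txt file (assumed key=value lines)
    config = {}
    for line in Path(config_txt).read_text().splitlines():
        if "=" in line:
            key, value = line.split("=", 1)
            config[key.strip()] = value.strip()

    # 3. modify (the actual time-stretch / pitch-shift DSP would go here)
    processed = frames  # placeholder

    # 4. write the small cache file
    out_path = cache_dir / (Path(source_wav).stem + "_cache.wav")
    with wave.open(str(out_path), "wb") as w:
        w.setparams(params)
        w.writeframes(processed)
    return out_path
```

The point is that every note costs two reads and one small write, which is why the workload turns into lots of small random I/O rather than one big sequential transfer.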

And while my current computer is only a quad-core, my next build will be an 8-core Xeon (or dual-socket) build, as I need it for other uses.

So I could have 8 of these instances going at once...

 

For an SSD I am looking at an ADATA 256GB SX900, but I am unsure whether it would be suitable.

What should I be looking for in an SSD regarding specs,

and do you have any you would recommend?

I think I need higher IOPS, but am unsure how high is necessary.

I think a Samsung 840 Pro would be the best choice. Very fast and very reliable. It's made for hardcore professional use.

I was also looking at those-

Is the $75 price difference worth the performance? I am looking at the 256GB version.

EDIT: the following is a separate question. The files are saved in a folder, and this is NOT hard-disk-cache related.

Also, do you know of a smaller (~64GB) SSD with similar performance? I want to get one for HDD caching, and 128GB seems like too much.

How much data is being stored? You could use a RAM cache if the files will fit: http://www.romexsoftware.com/en-us/primo-cache/

well...let me think...

 

The cache size varies with the size of the project; 25MB per project seems about right...

I have these set to delete on close, but I HAVE set them to save in the past, only to realize the cache totaled over 7GB.

This isn't hard-drive caching in that sense; that was a separate add-on question. The files are saved to a folder (unless I'm not fully grasping what you meant).

I suppose I could create a RAM disk and set that as the caching folder (a rough sketch of redirecting the folder is below), but even then the source files would still need to be read constantly, and my hard disk can't keep up with the random writes.
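
For what it's worth, if the app always writes its cache to a fixed folder, one way to point that folder at a RAM disk on Windows is a directory junction. A rough sketch, assuming a RAM disk mounted at R:\ and a disposable cache folder next to the voicebank (both paths are assumptions for illustration):

```python
import shutil
import subprocess
from pathlib import Path

def redirect_cache_to_ramdisk(cache_folder, ramdisk_target):
    """Replace an on-disk cache folder with a junction pointing at a
    RAM disk location, so the app's small-file writes land in RAM."""
    cache = Path(cache_folder)
    target = Path(ramdisk_target)
    target.mkdir(parents=True, exist_ok=True)

    # Remove the existing on-disk cache folder (contents are disposable).
    if cache.exists():
        shutil.rmtree(cache)

    # mklink /J creates a directory junction (no admin rights needed).
    subprocess.run(
        ["cmd", "/c", "mklink", "/J", str(cache), str(target)],
        check=True,
    )

# Example with hypothetical paths:
# redirect_cache_to_ramdisk(r"C:\UTAU\voice\cache", r"R:\utau_cache")
```

This only helps the cache writes, though; as noted above, the source wavs would still be read from the hard disk.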

 

Since the source files have to be re-read constantly with every task, I assume I need an SSD with good IOPS anyway. Is this correct?

Use the cache, not the RAM disk. The application is free to try. Let me explain how it works.

  • First, I suggest the server edition, since you can aggregate all hard drives into one cache: Primo_Server_Setup
  • Next, pick a RAM amount that leaves you with enough system memory to not cause issues. Since I have 32GB and I'm not using it for a specific application, I run a 15GB cache. I can run it higher but choose not to, as some applications I use require 10GB+ of RAM and force me to shut the cache down. SETUP
  • I set a deferred write of 30 seconds. I chose this because write operations are allowed to accumulate for a period of time and are then flushed to disk all at once, reducing the overall number of disk I/O operations (see the sketch after this list). I'm not worried about mission-critical writes going missing from issues such as a blue screen.
  • Additional information: Primo Performance Terms & PrimoCache Configuration Terms
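
Here is a minimal sketch of the deferred-write idea in Python (an illustration of the concept only, not PrimoCache's actual implementation): writes accumulate in memory, repeated writes to the same file within the window collapse into one, and everything is flushed in a batch when the timer expires.

```python
import threading

class DeferredWriteCache:
    """Toy illustration of deferred writes: buffer writes in RAM,
    coalesce repeated writes to the same path, flush on a timer."""

    def __init__(self, flush_interval=30.0):
        self.flush_interval = flush_interval
        self.pending = {}            # path -> latest data (coalesced)
        self.lock = threading.Lock()
        self._start_timer()

    def _start_timer(self):
        t = threading.Timer(self.flush_interval, self.flush)
        t.daemon = True
        t.start()

    def write(self, path, data):
        # Overwriting the same key means repeated writes to one file
        # cost only a single disk write at flush time.
        with self.lock:
            self.pending[path] = data

    def flush(self):
        with self.lock:
            batch, self.pending = self.pending, {}
        for path, data in batch.items():
            with open(path, "wb") as f:
                f.write(data)
        self._start_timer()      # schedule the next flush
```

The trade-off is the one mentioned above: anything still sitting in the buffer is lost if the machine blue-screens before the flush.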

I/O results: I ran some 10-minute I/O tests with IOMeter to show you the difference caching makes on SSDs and HDDs. My current setup is two SSDs in RAID 0 (C:) and an HDD (F:).

Test setup: 8KB writes across 10 workers

I also increased the write size to 64KB. I/Os per second go down, but the amount written per second increases.

The reason the write numbers are so high is that deferred write coalesces multiple writes to the same location. If you look at the screenshots and check "normal write", that is the total size in bytes of data written to the disk from the cache due to timer-flush or manual-flush operations.

Okay, I set it up, rebooted, and such.

 

I had to play around with it to make it work for some reason... which was a bit odd...

Since the files have to be read first, the initial render still takes quite a while.

After that it renders fast, but then there is a long pause before it actually writes the rendered file for preview. During that time Primo shows a constant hit rate of ~17%.

 

For other things this would be a great solution, but for my use case it doesn't seem to work very well.

I think I will go for an SSD for my main OS drive and either use this program as a cache solution for my secondary HDDs or go for physical cache with a second SSD.

 

Would you be able to suggest a fast SSD that I could use as a physical cache? For my main drive I'm going to get a 256GB 840 Pro, but for the cache I wanted something smaller than 128GB... maybe 64GB, which the 840 Pro isn't available in.

 

I think I may be able to use Primo for my secondary drives, but not on my main drive, as renders still take very long.