CT and Radiography Workstation Build Advice

Hey y’all, I am looking to build a workstation for CT reconstruction, image processing, and Monte Carlo simulations (MCNP, the Monte Carlo N-Particle code), with a bit of gaming on the side.

I deal with lots of RAW images and process them with the open-source RTK reconstruction toolkit along with TomoPy.

The only thing I know for sure is that I need an NVIDIA GPU to take advantage of CUDA acceleration.

Any suggestions for platform? Ryzen, Threadripper, or Intel? Storage options (NVMe vs. SATA)?


I don’t have super specific advice, but I do cryo-EM tomography reconstructions and can give you my general experience. One caveat: I don’t use TomoPy, I use eTomo, and I don’t run many MC sims except for maybe water placement, although I use similarly performing algorithms like glowworm swarm optimization and STRA6.

My experience is that most consumer-grade CPUs have the computational horsepower to get through any of these calculations quickly, and that the limiting factor is almost always read speed (and possibly memory). Because the tomograms I generate are usually 5+ GB, opening one can take a minute or more off a SATA SSD. If I had the budget, I’d go for the fastest possible storage. However, because I generate huge files, top-tier drives aren’t really worth pursuing for me: I’d only be able to hold a couple of datasets at a time on them. So for computational power, especially if you’re offloading work onto a GPU, just about any modern 6+ core CPU will do fine.
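A quick way to confirm whether storage really is your bottleneck is to time a raw sequential read of one of your own datasets. This is just a minimal stdlib sketch; the commented path is a placeholder, not a real file:

```python
import os
import time

def read_throughput_mb_s(path, block_size=64 * 1024 * 1024):
    """Sequentially read a file in large blocks and return MB/s."""
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        while f.read(block_size):
            pass
    elapsed = time.perf_counter() - start
    return size / elapsed / 1e6

# Point this at one of your own multi-GB RAW files, e.g.:
# print(read_throughput_mb_s("/data/scan_001.raw"))
```

SATA SSDs top out around 550 MB/s, while decent NVMe drives sustain several GB/s of sequential reads, so a 5 GB tomogram load can drop from roughly ten seconds to one or two.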

For 3D angular searching and 3D refinement, things get complicated. The GPU should hopefully do most of that for you with CUDA, although I don’t know much about TomoPy. With eTomo, a Quadro K4000 finishes in a few minutes what my Ryzen 3700X takes about 30 minutes to do. In my experience, once you have a CUDA-capable GPU of any sort, the limiting factor is memory and swap. First, getting the file loaded takes some time. Then, depending on the type of searching or filtering it’s doing, your computer will need to keep multiple iterations of the file loaded at once, because it compares things like cross-correlation and standard deviation of density between refinements to make sure it’s making progress. This can eat massive amounts of RAM. On systems with 32 GB, I routinely swap to disk, which slows things significantly and makes the computer unusable while it’s happening. You know the size of your datasets better than I do, but if I were making a new build for microscopy work, 64 GB would be the bare minimum.
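You can sanity-check that 64 GB floor with back-of-the-envelope arithmetic: keeping several iterations of a volume resident multiplies the per-volume cost. A tiny sketch (the shape and copy count are illustrative, not taken from any particular package):

```python
def volume_ram_gb(shape, copies=1, bytes_per_voxel=4):
    """RAM in GB needed to hold `copies` float32 volumes of `shape`."""
    voxels = 1
    for dim in shape:
        voxels *= dim
    return voxels * bytes_per_voxel * copies / 1e9

# A single 1024^3 float32 volume is ~4.3 GB; three working copies
# (current iterate, previous iterate, filtered comparison) is ~13 GB,
# before counting the OS, a viewer, or the projection data itself.
print(volume_ram_gb((1024, 1024, 1024), copies=3))
```

Scale that to a 2048³ volume and a single copy is already ~34 GB, which is why 32 GB systems end up swapping.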

So basically, when choosing a CPU, I’d think about PCI-E lanes, RAM channels, and RAM speeds. I think, if you have the budget for it, setting up an Optane cache would possibly be the way to go, but I don’t have access to that hardware so I’ve no way to verify. I’d love to get @wendell some sample data and see what he could do with it, as I’m currently planning out some builds for an incoming grant.

So basically, if it were me, I’d go with a bottom-shelf Threadripper and a motherboard with lots of PCI-E lanes and memory channels, make sure all the memory channels are populated, add a cheap CUDA card in the $200-or-less range (if you’re running Linux and want stereoscopic 3D, make sure you get a Quadro), and spend the rest of the budget on the fastest storage I could get.


Thanks @COGlory! This was a thorough answer. This whole Covid stuff has messed with my experimentation time at the lab so I am not sure how much data I will be allowed to collect since the scanner is busy all the time.

I think I will be able to snag a friend’s old Threadripper build for cheap (1900X, GTX 980, 32 GB of 3200 MHz RAM). Would it be worth getting it, dropping in a 2000-series Threadripper CPU, and populating it with my own storage and RAM?

Or would it be best to build from scratch?

Depends on price, but it might be worth keeping that build as-is and seeing where you’re bottlenecked by reconstructing some test data. If it’s the CPU, I’m sure it’s nothing a 2950X or so couldn’t resolve (but again, I doubt the CPU will really hold you back). If it’s memory, it should be easy to add another 32+ GB as long as all the DIMMs aren’t populated, and 3200 MHz is a workable speed; many Intel rigs use far slower memory. The 980 should be plenty, we still use 980s for a lot of research tasks.
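The easiest way to see where that test data bottlenecks is to time each pipeline stage separately, so load time and reconstruction time show up as distinct numbers. A stdlib-only sketch; the `load_projections` and `reconstruct` names in the usage comment are placeholders for your own calls:

```python
import time

def profile_stage(label, fn, *args, **kwargs):
    """Run one pipeline stage and report its wall-clock time."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.2f} s")
    return result

# Wrap your own functions, e.g.:
# proj = profile_stage("load", load_projections, "scan_001")
# rec  = profile_stage("recon", reconstruct, proj)
```

If the "load" stage dominates, spend the budget on storage; if "recon" dominates, look at the GPU (and if RAM usage climbs toward the swap threshold during recon, at memory).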

Either way, I don’t see a way that you’d be bottlenecked with no upgrade path on that platform in the next… probably 3-5 years? But asymmetric reconstruction is a tricky thing and kind of depends on your datasets and what you’re trying to do with them.

What I can say is that that computer is on par with, or better than, the one I’ve been using for tomographic reconstructions and local averaging for the past year or so, and the only hard limit I’ve hit with mine is storage.

That sounds like a plan. I will more than likely snag the build and run some test reconstruction procedures and get a baseline.

For storage, are large NVMe drives the way to go? I was thinking of maybe getting some ADATA XPG 8100 Pros. Or is there a better setup?

Hey. What kind of image processing are you likely to be doing? I’ve had to process terabytes of tissue pathology slide scans at home, and I second @COGlory’s comments on storage. It’s been an absolute nightmare dealing with a networked dataset, my home network, the university and hospital VPNs, etc.

I’ve upgraded to a 3900X (from an FX-8350; there are good deals on bundles atm) and a single GTX 970, and it seems plenty powerful as a workstation for most imaging and genomics work at least. I can’t speak to Monte Carlo simulations.

But yeah, if patient confidentiality isn’t an issue, those NVMe drives seem grand. Maybe just steer clear of any QLC types, like the cheaper Sabrent drives.

Hi everyone. Not an expert in these fields, but I’d humbly wish to bump this thread up.

N.B. It would be superb, imho, to see some footage of L1 muscle-machines number-crunching medically relevant image datasets, as COGlory tantalisingly suggests above.

It seems frivolous to place undue emphasis on FPS games when the human race is plagued by huge existential challenges. Moving forward, the next big frontier is medicine. Personally, I see little point in squandering billions to land on a space rock when it has already been done and satellite launches are now routine.

As my username aims to suggest, medical advancement is now man’s greatest imperative. These are incredibly powerful machines; they’ve got to be used for more than entertainment and escapism.


As of now I am sticking with open-source Python packages: TomoPy, scikit-image, …

I am mainly doing contrast adjustments and segmentation to ID oddities. It’s more about bringing medical CT techniques to materials science, so I will be imaging lots of medical phantoms and a variety of metals.
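For that kind of work, the core loop is usually a contrast stretch followed by a threshold. Here’s a minimal NumPy-only sketch of the idea; the percentile limits and the mean-based threshold are arbitrary choices for illustration (scikit-image’s `exposure.rescale_intensity` and `filters.threshold_otsu` are the production versions):

```python
import numpy as np

def stretch_and_segment(img, low_pct=2.0, high_pct=98.0):
    """Percentile-based contrast stretch, then a crude global threshold."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    stretched = np.clip((img - lo) / (hi - lo + 1e-12), 0.0, 1.0)
    mask = stretched > stretched.mean()  # flags bright "oddities"
    return stretched, mask

# Example on a synthetic slice with one bright inclusion:
slice_ = np.zeros((64, 64), dtype=np.float32)
slice_[20:30, 20:30] = 100.0
stretched, mask = stretch_and_segment(slice_)
print(mask.sum())  # → 100 (the 10x10 inclusion is flagged)
```

For real scans of metal parts you’d run this slice by slice (or in 3D) and replace the mean threshold with Otsu or a material-specific HU cutoff.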

Should I just look for a good TLC drive?
