Blender Rendering Workstation

I’m working on putting together a new workstation for a Blender artist at the studio I work for. The current PC has a 5950X, 96GB RAM, and 2x 3090 FE. The budget is roughly $10K CAD, but I’m willing to stretch it if needed.

For the next PC, I’m looking to do dual 4090s, and possibly three of them if there’s a way to fit them. I’ve been asking around and on Reddit and have come up with this initial list:

  • CPU: AMD Threadripper 7960X
  • Cooler: Noctua NH-U14S
  • Mobo: Asus Pro WS TRX50-SAGE WIFI
  • RAM: G.SKILL Zeta R5 Neo 128GB
  • SSD: 2 x 2TB 980 Pro, 1 x 4TB 990 Pro
  • GPU: 2 x 4090 FE (open to change)
  • Case: Phanteks Enthoo Pro 2 Server
  • PSU: SeaSonic PRIME PX-1600 ATX

I’m completely new to workstation- and server-grade hardware, so I’d like another set of eyes to check whether I’m making the right decisions parts-wise. A few questions as well:

  • Would it be possible to fit a third 4090 in here without resorting to a custom water-cooling loop? Possibly using AIOs?
  • If not, am I better off going with an i9 or a 7950X? Do those chips have the bandwidth to handle two of these cards? There seemed to be no issue with the 5950X and the 3090s. How does the Threadripper compare to a 7950X or i9 in terms of single-core performance?
  • Is a 1600W PSU enough to power both cards and the Threadripper? If I add a third 4090, will I need an additional PSU, and how would I go about setting that up in a build?
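
For that last question, here’s the rough power math I’ve been using (all wattages are assumed nameplate/TDP figures off the spec sheets, not measurements, and transient spikes aren’t modeled, so correct me if I’m off):

```python
# Back-of-the-envelope sustained power budget -- nameplate figures only.
GPU_W = 450   # RTX 4090 FE rated board power
CPU_W = 350   # Threadripper 7960X TDP
REST_W = 150  # motherboard, RAM, SSDs, fans (rough guess)
PSU_W = 1600  # Seasonic PRIME PX-1600 rating

for gpus in (2, 3):
    load = gpus * GPU_W + CPU_W + REST_W
    print(f"{gpus}x 4090: ~{load} W sustained, "
          f"{load / PSU_W:.0%} of a {PSU_W} W PSU")
# 2x 4090: ~1400 W sustained, 88% of a 1600 W PSU
# 3x 4090: ~1850 W sustained, 116% of a 1600 W PSU
```

At ~88% sustained on two cards it looks workable, especially if I power-limit the 4090s for rendering, but a third card clearly pushes past the PX-1600.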

You might want to search up that proposed motherboard on this forum before buying it…


Could you be more specific? I’m mainly just seeing issues with Linux, and I plan to run Windows exclusively on this.


Aren’t all the new TR issues more a WRX90 problem than an ASUS problem? I assume there’s a reason Asrock has basically unlaunched their WRX90 motherboard.


I mistook T for W. You can probably ignore me altogether. My apologies!


As a Blender user myself, and someone who has done rollouts for businesses, the only option here is Threadripper + 2x 4090. Blender’s Cycles and CUDA support is great on Nvidia.

If you are on Windows this is a no-brainer.
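
If it helps, this is roughly how I point Cycles at multiple cards from Blender’s scripting console (standard Blender 3.x Python API; picking OptiX over CUDA is my assumption since these are RTX cards):

```python
# Run inside Blender's Python console or a startup script.
# Enables every OptiX-capable GPU for Cycles rendering.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"   # OptiX is usually faster than CUDA on RTX
prefs.get_devices()                   # refresh the device list

for dev in prefs.devices:
    # Enable both 4090s; leave the CPU off so it doesn't drag render times down
    dev.use = dev.type != "CPU"
    print(dev.name, "->", "on" if dev.use else "off")

bpy.context.scene.cycles.device = "GPU"
```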

  • As for RAM, get as much as you can.
  • As for storage, it depends whether your users upload to a local server or a portal. If these are local builds (strange for a company with multiple users doing creative workloads), then getting the fastest SSDs you can is a good idea. The caveat is that I typically run the SSDs in RAID 0 for the fastest performance on large files (rendering small movie scenes or promotional content).

128GB is actually on the low side in my opinion. Also, the speed here is not much of a concern: if the sticks are low-CAS they are good, and you should be running quad-channel anyway. How much you need depends on understanding the work being done.
My example is a small indie movie (only rendering the horrific monster scene, lighting and environmental effects, or animations).

Running the SSDs in RAID 0 delivers far more performance for really large files.
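
To sanity-check that the stripe is actually delivering, I time a big sequential read; here’s a rough sketch (the path is a placeholder — use a file larger than system RAM so the OS cache can’t flatter the numbers):

```python
# Quick-and-dirty sequential read benchmark for the RAID 0 volume.
import time

PATH = r"D:\scratch\big_test_file.bin"   # hypothetical test file on the stripe
BLOCK = 16 * 1024 * 1024                 # 16 MiB reads

total = 0
start = time.perf_counter()
with open(PATH, "rb", buffering=0) as f:     # unbuffered to avoid double caching
    while chunk := f.read(BLOCK):
        total += len(chunk)
elapsed = time.perf_counter() - start

print(f"Read {total / 1e9:.1f} GB in {elapsed:.1f} s "
      f"-> {total / 1e9 / elapsed:.2f} GB/s")
```

Obvious caveat: RAID 0 has zero redundancy, so the projects need to live somewhere else too.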

I think it goes without saying that Nvidia just supports Blender better than AMD and even Intel right now; there’s no argument to be had here, especially for a business.

The CPU is fine, but I would look into what the projects actually involve. 24 cores is good, but more never hurts here.


Can Blender pool memory across two non-pro cards? Otherwise your maximum scene size is capped at 24 GB of VRAM; two cards just render it faster. Generally, it would be much smarter to go RTX 6000 Ada + 7950X if your scenes are large (and if you are a professional, I assume that is the case).
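
If you want to see how close a scene gets to that 24 GB per-card ceiling, here’s a quick check to run while the render is going (assumes the nvidia-ml-py / pynvml bindings are installed; everything else is standard NVML):

```python
# Per-GPU VRAM check during a render -- each card holds a full copy of
# the scene, so watch the per-device number, not the total.
# Assumes `pip install nvidia-ml-py` for the pynvml bindings.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo,
                    nvmlDeviceGetName)

nvmlInit()
try:
    for i in range(nvmlDeviceGetCount()):
        handle = nvmlDeviceGetHandleByIndex(i)
        mem = nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i} ({nvmlDeviceGetName(handle)}): "
              f"{mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB used")
finally:
    nvmlShutdown()
```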


Good call on the NVLink: Nvidia removed support for it on RTX 4090 cards. The RTX 3090 DOES support it…

The Noctua NH-U14S is a good cooler, but the NH-D15 is considerably better for the extra 20 bucks; I’ve owned and used both. The G.SKILL looks purdee, but you might be able to find RAM that performs as well or better for less money. I would highly recommend taking a gander at Team Group memory if you haven’t done so already. When it comes to RAM, latency really is a thing when you are dealing with big workloads, and it seems you plan on doing just that. RAM can actually perform faster at a lower frequency with better latency, so be sure to check the latency of that memory before you pull the trigger. You want to keep those CAS numbers as low as possible.
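
The quick math, if it helps (true first-word latency in nanoseconds is CL × 2000 ÷ transfer rate; the kits below are made-up examples, not recommendations):

```python
# First-word latency: CL cycles at the memory clock, and the memory
# clock is half the transfer rate (DDR), hence the factor of 2000.
kits = {
    "DDR5-6400 CL40": (6400, 40),   # hypothetical higher-clocked, looser kit
    "DDR5-5600 CL28": (5600, 28),   # hypothetical lower-clocked, tighter kit
}
for name, (mts, cl) in kits.items():
    print(f"{name}: {cl * 2000 / mts:.1f} ns")
# DDR5-6400 CL40: 12.5 ns
# DDR5-5600 CL28: 10.0 ns   <- the "slower" kit wins on latency
```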

Bridging those GFX cards may give you around a 25% performance boost, but the real question is whether you really need it. SLI/Crossfire used to be justifiable to some extent in the past, but not so much these days. You might be better off with one graphics card and 256 GB of system RAM to get the best bang for your buck.

I can’t really comment on the overpriced NVIDIA 4090 as I’ve never owned one, but early adoption is generally a sure way to introduce yourself to extra headaches you can do without. Sometimes it is best to go with the earlier generation and wait until all the early adopters have gone through the meat grinder with BSODs, driver issues, etc., these being Microsoft’s new pay-to-play beta testers. Meh, it is what it is. Once most of the nasty issues with the newest, latest, greatest, bleeding-edge card are dealt with and the 5090 is released, that would be a more prudent time to buy IMO. You could always flip a 3090 later on to recover some of your investment and get the 4090 when you no longer have to pay through the nose for it.

Even game developers, for example, use single GFX cards in their systems these days because they are adequate. A third card might give you about a 13% increase in performance, tops. This is overkill, and unless you’re really doing some niche computing requiring mega teraFLOPS of mathematical computation, you likely don’t need it. You won’t need it for content creation, rendering, or game development. If you simply want to do it because you can, well, power to ya, but know that the more complex it gets, the more vulnerable you will be to extra issues. (And yeah, I’m a fine one to talk.) :stuck_out_tongue_winking_eye: Just sayin’.

Seasonic is renowned for quality, durability, and longevity. I’m thinking it will be more than adequate to run your build with power to spare. Now here’s the cruncher: given the recent issues with AMD Threadripper that I’ve been reading about, I cannot in good faith recommend it. Admittedly, AMD has given Intel a good swift kick in the pants (which Intel richly deserved), but like everyone else AMD has really been jacking up the prices on its Threadripper line while quality and dependability continue to drop. IMO Intel appears to be learning from its lessons (although I still wonder how far they’ve come with hardware-level mitigation of the whole Spectre/Meltdown thing), so I recommend taking a second look at Intel before you pull the trigger on the Ripper.

Being a user of both platforms, and no longer a fanboi of anything, I can only offer this much: Intel definitely stands behind their product. I know this from personal experience.

For what it’s worth, there it is.