Threadripper TRX50 & WRX90 Cooling Stuff: Watercooling, DDR5 RDIMM cooling, etc

Thank you so much!

OP delivers, OP responds… what’s happening lol?

In any case, I had to make a small change - as it turns out, at least on one of my boards the center-to-center distance of the RDIMM slots is 7.6mm (ASRock BERGAMOD8-2L2T). Your design is 8.3mm thicc. I cut it down to 7.4mm: 0.5mm from the inside, and 2x0.2mm from the thickness of the walls grabbing the DIMMs. While I was in CAD I decided to graft on some fins - better if heat can flow directly off the sink, and I don't need to mess with mounting a passive cooler on top. I did leave your mounting holes so a shroud or fan mount can easily be added later if needed.
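For anyone following along, here's a quick sanity check of those numbers (a minimal sketch, assuming the 0.2mm comes off each of the two walls, per the cuts above):

```python
# Fit check for the trimmed RDIMM heatsink, using the numbers from the post above.
SLOT_PITCH_MM = 7.6        # center-to-center DIMM slot spacing on my board
ORIGINAL_WIDTH_MM = 8.3    # width of the original heatsink design
INSIDE_CUT_MM = 0.5        # material removed from the inside
WALL_CUT_MM = 0.2          # material removed from each of the two DIMM-gripping walls

trimmed_width = ORIGINAL_WIDTH_MM - INSIDE_CUT_MM - 2 * WALL_CUT_MM
clearance = SLOT_PITCH_MM - trimmed_width

print(f"Trimmed width: {trimmed_width:.1f} mm")        # 7.4 mm
print(f"Gap between neighboring sinks: {clearance:.1f} mm")  # 0.2 mm
assert clearance > 0, "Heatsink is still wider than the slot pitch!"
```

That leaves about 0.2mm of air between neighboring sinks with every slot populated - tight, but positive.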

I didn’t think this would be possible to CNC, but PCBWay accepted my test-quantity order, so I guess they have some nice machines.

I’ll post here once the order comes in (they’re backed up and quoted 2 weeks to ship) and will share the design if it ends up being usable.

Thanks again!

3 Likes

Yeah, they are pretty good.

If that works out for you later, feel free to make a PR to that repo with the additional files so others who want to passively cool it can use them.

Thanks!

Time has passed and things have happened.

As it turns out my intuition was right and they couldn’t actually make the heatsinks as per my original design. They tried to, but those stupid fins I grafted on bent every which way imaginable.

They were SUPER nice about it and refunded the whole thing.

I did revise the design and I am now a happy man with lots of cool RAM. (Also a lot poorer than before. Should have gone with aluminum for everything.)

I did take the liberty to put everything in a new repo since the scope is now so different from your original project, and I did end up changing virtually everything.

If anyone wants to pay insane money to a CNC house for some RDIMM cooling, here’s a thing you can do:

Sexy as hell though.

8 Likes

Hello everyone! So, which cooling is the most rational choice for a Threadripper (among ready-made solutions)? From the discussion I gather that many of you run the SilverStone XE360 - after some time with it, are you satisfied?

The SilverStone XE360 is probably still the most ‘rational’ choice for cooling any generation of Threadripper with a TDP under 400W.

1 Like

Brilliant!

1 Like

What if it's more than 400W? Is building your own cooling system the only option? I don't plan to push my processor any further, but I believe there should be some headroom for heat dissipation. Any advice?

That would either entail building a custom loop cooling solution or going with one of the few ‘expandable’ AIOs, like the Alphacool AIO systems.

The problem with them is that the pump used is really only sufficient to push through a CPU block and one radiator. Adding another radiator to the loop really drops the flow rate, and with it the cooling improvement you would otherwise get from the additional radiator.

You also have to have a case that can support another radiator location or an open-frame chassis.

To give you a real-world example, I have a large case that fits two thick 360mm radiators in a custom loop cooling an overclocked, locked-clock 9950X running math compute on 95% of all cores, along with a 3080 Ti in the same loop. The GPU is also doing math compute full time. That loop is dissipating about 800W and I struggle to keep the CPU in the low 80s C.

2 Likes

@KeithMyers: You may want to research parallel vs. series flow restriction. https://www.performance-pcs.com/a/blog/blog/post/loop-design-blog

Watching this thread as I just received my Threadripper 9985WX and V-Color 512GB DDR5-6000 ECC kit. I don’t have my block yet (Enterprise Wasserkühler für SP3/TR4/sTRX4 und LGA 3647/2066 | Alphacool), but I will report back in a build log in a couple of weeks.

FYI: it won’t be a typical build. As a preview: there will be 4 of these fans among others (hat for scale), each moving 450 CFM and consuming ~6A / 75W, plus 6 radiators and 4 VPP APEX pumps (D5 variant), all in a dual-chassis rackmount setup with one 5U chassis dedicated to cooling components (rads, fans, etc.) and the other to the PC itself. I am going way overboard and think the final budget will be close to $20K :) I suspect with the amount of airflow I’ll have, I won’t have RAM temp issues, but time will tell. FYI: the machine will live in my ‘server closet’ in a custom-built ‘sound dampening’ rack, so noise is not a concern as I’m in a different room ~40ft away.

I started the build thread: Overkill Threadripper 9985WX build

2 Likes

By your own linked document…

At 1 gpm it takes 250 watts of heat to raise the water in your loop 1C. DDC and D5 pumps can achieve 1 gpm of flow through most standard loops, such as a CPU, GPU and two radiators.

Which is exactly my configuration: one CPU, one GPU, two 360mm radiators, and one D5 pump at 100% speed. I am achieving a 1 gpm flow rate in the loop.
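For reference, that ~250W-per-1°C-at-1gpm figure falls straight out of water's specific heat. A minimal back-of-the-envelope sketch, assuming plain water (~4.18 kJ/kg·K) and ignoring glycol or additives:

```python
# Coolant temperature rise vs. heat load and flow rate (plain-water assumption).
GPM_TO_KG_PER_S = 3.785 / 60   # 1 US gallon of water is ~3.785 kg; per minute -> per second
CP_WATER = 4180.0              # specific heat of water, J/(kg*K)

def water_delta_t(heat_w: float, flow_gpm: float) -> float:
    """Steady-state coolant temperature rise across the loop, in degrees C."""
    mass_flow = flow_gpm * GPM_TO_KG_PER_S
    return heat_w / (mass_flow * CP_WATER)

# The article's rule of thumb: ~250 W raises the water ~1 C at 1 gpm.
print(f"250 W @ 1 gpm -> dT = {water_delta_t(250, 1.0):.2f} C")
# My ~800 W CPU+GPU loop, still at 1 gpm.
print(f"800 W @ 1 gpm -> dT = {water_delta_t(800, 1.0):.2f} C")
```

So even at ~800W the water itself only rises about 3°C per pass through the loop.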

1 Like

Yep, understood, and we're getting off topic, but I thought you were looking for ways to reduce restriction in your loop. In that case, running the rads in parallel (which has other benefits, e.g. the same delta T across both rads) and running your CPU/GPU blocks in parallel would increase flow and present less resistance than a series setup.

From the same article:

A parallel GPU configuration alone, is helpful because the resistance in the loop dramatically decreases when the blocks are put in parallel instead of series. The reason behind this is because when you put resistive components into your loop in series, the resistance is additive, however when they are in parallel, the total resistance is less than the smallest amount of resistance in any one of the components.
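To put toy numbers on that quoted claim, here's a minimal sketch using the plain resistor analogy (real blocks have roughly quadratic pressure-drop curves, so the values below are made up and only show the trend):

```python
# Series vs. parallel loop restriction, using a simple linear-resistance analogy.
# (Real watercooling components behave closer to dP ~ k*Q^2, so this only shows the trend.)

def series(restrictions):
    # In series, restrictions simply add up.
    return sum(restrictions)

def parallel(restrictions):
    # In parallel, the combined restriction is less than the smallest single one.
    return 1.0 / sum(1.0 / r for r in restrictions)

cpu_block, gpu_block = 3.0, 5.0   # made-up relative restriction values

print(f"CPU + GPU in series:   {series([cpu_block, gpu_block]):.2f}")    # 8.00
print(f"CPU + GPU in parallel: {parallel([cpu_block, gpu_block]):.2f}")  # 1.88
```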

Let's say I’ve done my share of loops over the years, from simple to… well, this ugly mess (worked well enough though):

Hope this helps someone,
Brian

2 Likes

If I had enough room in the case, I would have just plumbed up separate loops for the cpu and gpu.

That is what I do for the server cases: one 360 rad and D5 for the EPYC and one 360 rad and D5 for the two GPUs. I never hit more than upper-40°C temps on either loop.

But the Ryzen 9950X/3080 Ti host case just doesn’t have enough space for mounting another D5. That was a struggle as it was. I also have to fit a 240 rad for the 3080 AIO in the case. The case is not ideal at all for what ended up in it; it is just a Phanteks Enthoo Pro 2.


So, back to RDIMM cooling: I am also looking for a similar solution. I have a Gigabyte TRX50 AI TOP motherboard with a 9975WX Threadripper PRO. I'm going with an Alphacool Eisblock XPX Aurora PRO for the CPU and two Alphacool water blocks for the 5090s, which I am still waiting for. I purchased 8x G.SKILL Zeta R5 NEO (AMD EXPO) DDR5-6400 RDIMMs, but with the heatspreaders already on this memory and the minimal clearance between modules, I am not sure anything will really work. Possibly @martona's design might, but just looking at my board spacing and the first PCIe slot, it seems like putting anything on top of the RAM, even if it fit, would make that first PCIe slot unusable. Not entirely a big deal as I only have two 5090s to fit, but I am curious how others are dealing with the spacing, and whether anyone has put a water cooling solution on this motherboard or RAM yet and has any tips to share.

1 Like

I haven’t tested any of the coolers myself, so I can’t give you a definitive “it fits” or “it doesn’t.”
But I ran into a similar clearance question on another board (for my EPYC server), so I threw together a quick-and-dirty 3D model of the motherboard and did a digital test fit of all the components/modifications I was considering.

Here’s the model if you want to take a look and see how I did it:

It’s obviously not a precision CAD file, but it was good enough to get a rough sense of spacing before committing to anything.
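If you don't want to model a whole board, even a crude bounding-box script can answer the "does it clash with slot 1" question. A minimal sketch with invented placeholder dimensions (nothing below is measured from a real board):

```python
# Crude axis-aligned bounding-box clash check for a "digital test fit".
# Every dimension below is an invented placeholder in mm - measure your own board.

def overlaps(a, b):
    """a and b are (min_corner, size) pairs of 3D tuples; True if the boxes intersect."""
    (ax, ay, az), (adx, ady, adz) = a
    (bx, by, bz), (bdx, bdy, bdz) = b
    return (ax < bx + bdx and bx < ax + adx and
            ay < by + bdy and by < ay + ady and
            az < bz + bdz and bz < az + adz)

# Hypothetical layout: a DIMM bank with tall heatsinks next to a card in PCIe slot 1.
dimm_bank = ((0, 0, 0), (140, 10, 55))    # DIMMs + heatsinks
slot1_gpu = ((0, 8, 0), (300, 45, 120))   # GPU hanging over the DIMM area

print("CLASH" if overlaps(dimm_bank, slot1_gpu) else "clear")
```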

1 Like

Those round Asetek AIO adapters are not the best thing to use on the TR/EPYC IHS. First, they are oversized, which causes the obstruction you see in your photo. Second, they don’t cover the entire IHS, making for poor heat transfer.

The best thing to do would be to install a custom loop with a proper TR-sized CPU block, which will not have any overhang and fits within the stock confines of the SP3/TR4 socket.

The best AIO for TR/EPYC is the SilverStone XE360-TR5 cooler, and it too fits within the stock confines of the TR socket.

1 Like

While I support putting radiators in parallel to achieve the same delta-T profiles for heat dissipation, I typically strongly discourage putting cooling blocks in parallel unless they are of the same type, with generally the same length of tubing from the next upstream device. A mismatch in restriction will force an unbalanced flow of coolant through the blocks, i.e. the block with the lowest restriction will receive more flow (sometimes much more, depending on the degree of restriction divergence) than the more restrictive path. Two CPU blocks of the same type in parallel? Generally OK. A CPU and a GPU block in parallel? Proceed with caution.
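To show how lopsided that split can get, here's a minimal sketch assuming each branch follows a dP ≈ k·Q² restriction curve; the k values are invented, not measured from any real block:

```python
# Flow split between two parallel branches sharing the same pressure drop.
# Model: dP = k * Q^2 per branch (a common approximation for watercooling blocks).
import math

def split_flow(total_gpm: float, k_a: float, k_b: float):
    """Return (Q_a, Q_b) such that Q_a + Q_b = total and k_a*Q_a^2 == k_b*Q_b^2."""
    ratio = math.sqrt(k_b / k_a)        # Q_a / Q_b
    q_b = total_gpm / (1.0 + ratio)
    q_a = total_gpm - q_b
    return q_a, q_b

# Invented coefficients: a free-flowing GPU block vs. a more restrictive CPU block.
q_gpu, q_cpu = split_flow(total_gpm=1.0, k_a=1.0, k_b=4.0)
print(f"GPU branch: {q_gpu:.2f} gpm, CPU branch: {q_cpu:.2f} gpm")  # 0.67 vs. 0.33
```

With one branch four times as restrictive as the other, the free-flowing side ends up with twice the flow.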

2 Likes

I created this OpenSCAD model of a memory duct fan for my HAVN 420 case and ASUS TRX50 Sage, for 3D printing. There are variables at the top that can be adjusted for your case/motherboard.
RAM_Duct.scad.zip (1.6 KB)

RAM_Duct.stl (438.4 KB)

2 Likes

Current prototype of a 140mm fan duct directing air over the DIMMs, for an ASRock WRX90 EVO mounted in a CaseLabs Mercury S8. The duct outlet is 35mm above the DIMM channels. First time using Autodesk Fusion.

  • Board mechanical diagram sourced from the ASRock Rack site

  • Once I had the layout designed, I printed a validation part. Channel layout matters due to PCIe clearance, hence the ABCD and EFGH labels.

The full print is underway. I expect further edits; maybe in a couple of days there will be a usable part.

3 Likes

It might be worth adding some air guides/baffles inside the duct to make the flow more “even” across the DIMMs; I’d imagine a lot of the air is going to exit near the outer radius of the bend without them. The baffles could also help the overhangs print more reliably, depending on how they’re situated.

2 Likes

Planning on adding air guides. Also considering ditching the screw mounting for a channel/groove or snap mount - something easier to remove. Fusion apparently has CFD; I might give that a go.

2 Likes