$35000 surely gets you a nice.... wait WHAT?

This is not exactly my build - I mean I own it, but I did not build it. It was purchased from a pretty well-known SI who shall remain nameless - but if you’re in North America you probably know of these guys. They have been around for a long time, solid reputation, pricey but very solid builds. This isn’t a “got fucked over on craigslist” post. And the price is not a typo.

For that though I’d have expected a little more… craftsmanship.

Rack case, removed top - greeted by cables.

That bundle is just casually laid on the 60mm fan. Doesn’t quite touch the blades so I guess it’s whatever.

The AIO doubles as cable-cooler. There’s high amps in here, you never can be too careful.

Bracket to hold the GPU and… block all air from the only source nearby? Here I am genuinely puzzled: maybe the GPU is supposed to pull air in from the back of the case through the bracket grilles by sheer negative pressure?

Gotta be.

2 Likes

What the hell is in that thing that could cost $35,000?! That’s almost the price of my Toyota Tacoma, which is an actual vehicle meant to last at least a decade or three!

Could you give us the specs, and a photo of the whole thing? Is that just a 5090 or one of the Nvidia workstation cards? Are those Noctua fans on the RAM? Does it have like a dual Xeon in it?

5 Likes

Ouch, that’s about what I’ve spent on my desktop, and I’m sitting with two 6000 Blackwell cards. I do only have a 64-core Threadripper and 768GB of DDR5-6000, so maybe the cost went into your CPU/RAM?

I would take pictures/video before, then tidy the cables, then take pictures after, and send them all to the SI.

For that money, it might be worth them taking more care with customers’ builds, and I bet it was just a couple of guys taking shortcuts, rather than the whole company.

Pretty sure a “call in to the manager’s office” would correct the attitude of the person who should have done the build.

But I would not shout it too loudly, else the lazy person might just get fired, and that may be excessive.

As for the plastic air-blocking shroud… if it was inserted for shipping stability, it really should have been marked as such, but it does seem more like it is intentionally supposed to be there. I would also reach out to the SI and ask what its purpose is, and the reasoning behind shaping the airflow that way, if any.

6 Likes

Would do the same.

The guy who put that together mustn’t have much love for such nice hardware.

It’s just your HEDT in 2025… nothing too exotic about it. $20k for CPU+GPU, $1k motherboard, $4k RAM, $4k storage - next thing you know they’ve actually lost money on it. :slight_smile:

Case is just a short 4U. Not going to take a picture of it because it has these guys’ logo on it.

See above, but basically yeah, plus I can dd /dev/nvme* to /dev/null for hours at 5GB/sec before I have to start over.
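(If anyone wants to replicate that, it’s basically a long sequential read across the drives - the device names and block size below are placeholders, not the exact invocation I use:)

```bash
# Hammer each NVMe drive with a long sequential read and watch throughput.
# The device names are placeholders - point this at your own drives.
for dev in /dev/nvme0n1 /dev/nvme1n1 /dev/nvme2n1 /dev/nvme3n1; do
    dd if="$dev" of=/dev/null bs=1M status=progress &
done
wait
```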

I’ve been buying systems from these guys for 20 years, and watched quality slowly slip. Last box a couple of years ago didn’t have the motherboard’s 8-pin auxiliary PCIe power plugged in, so it booted to a BIOS error. It came loose on the PSU side during shipping - which means it was never plugged in properly. I complained to them in the nicest possible way, and they barely acknowledged it. So I guess I am venting here instead.

The acrylic GPU bracket is meant to be permanent. They used to do temporary shipping fixtures with giant “REMOVE BEFORE USE” stickers, and transitioned to these permanent ones a number of years ago. Anything heavy inside, like big cards or large air coolers, gets a custom-made piece of plastic, and they aren’t skittish about drilling M4 holes into GPU plastic frames if they have to. It’s usually very strong, secure, and never in the way. I really am confused about what they did here.

And yeah the cabling’s getting fixed. The only question is, am I stocking up on genuine Molex MiniFit Jr. crimping parts and doing it myself, or am I ordering “roughly OK length” knockoff cables from China?

2 Likes

I was implying nothing more than cable-ties.

If you need actual cables (like longer ones to fit/route properly), then I would email them and get them to mail them to you free of charge.

It has been many years since I bought a system, and that one was cabled tidily for less than a 20th of your price…

Sad to see a company worsen, whatever company it is…

Being in a tech forum, I’d really like to know the specifics like which CPU, GPU, motherboard, how much RAM, how much storage, and which AIO. :slight_smile:

Good move on not giving the company name as it’s not that relevant anyway.

1 Like

That GPU bracket is something else.

I don’t imagine it’d be that easy to design a bracket for that particular use case that still allowed plenty of airflow… at least not one that was still as strong or made out of acrylic. But there must have been a better solution, surely?

It really is a small case and there’s no room to put all that excess. Without custom cables it’s going to be an eyesore no matter what. But the weird thing is that these guys stick to one PSU vendor, so I am sure they could get factory cables in any length they wanted.

Here you go sir:

  • Asus WRX90E-SAGE SE
  • Threadripper 7995WX
  • 512GB DDR5-5600 ECC
  • RTX Pro 6000 Blackwell
  • 4x 8TB Gen4 M.2
  • 1600W PSU
  • Asetek 240 AIO

Yes.

The OP photos didn’t show the PCI retainer that 3U+ cases have. I removed it because it was hiding most of the cabling mess. They bolt the acrylic to the GPU and this retainer. It’s simple and effective. The plastic piece just doesn’t have to be this wide; it looks like it’s sized to span (and block air from) two GPUs.

5 Likes

Why a 240 and not a 360?

I didn’t ask, but I think Falcon NW also uses a 240 in their 4U Threadripper? I could be wrong.

It doesn’t seem adequate. I know I had to put 9W SanAce fans on a 360 to handle a 400W Zen 5 without the water (and therefore the entire 4U case) going to hell. I had Noctua Industrial 2000s on initially and the VRMs were tickling 120C when I called it quits.

Maybe the thinking is that they’ll let the CPU thermal throttle, the coolant is what it is, and the 120mm intake fan to the side of the 240 keeps the inside of the case manageable? I am not sure - normally I’d say they’re the pros and they’re in business because they know this stuff way better than a rando customer does, but I do have questions in this case lol.

One more nit I found after I got into IPMI: all the fan curves look reasonable but they’re all tied to CPU temp. Even the RAM fans. Like, you’re going through all this trouble (sturdy fan mount custom made to sit on the CPU block and hold 60mm fans in the right spot) and you’re not using DRAM temp sensor readouts for them… why. Maybe because the ASUS BMC gives you per-slot DIMM readings and no aggregate, but I’m really grasping for excuses here.
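The per-slot readings are at least accessible if you ask the BMC directly; a rough sketch with ipmitool (sensor names vary by board and firmware, so treat the grep pattern as a guess):

```bash
# List every temperature sensor the BMC exposes; on this board the DIMMs
# report per slot rather than as one aggregate value.
ipmitool sdr type Temperature

# Quick-and-dirty filter for just the DIMM sensors - adjust the pattern
# to whatever names your BMC actually uses.
ipmitool sdr type Temperature | grep -i dimm
```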

For such a price tag, I’d expect the system integrator to have checked & dialled in every knob in the BIOS and ensured all settings are optimal for the system. Default settings, or defaults with minimal tweaks, aren’t justified for the service.

What will you be using the system for if I may ask?

3 Likes

My new main workstation. CAD, EDA, coding, tons of VMs. It might even let me keep 6-8 Chrome tabs open, we’ll see. :slight_smile: (I do have a separate LLM box so it won’t have to carry that burden.)

Ubuntu with root-on-zfs, and one massive Windows VM hosted on a zvol for the Windows stuff I still can’t get by without. And gaming in there with Looking Glass, IF MIG finally starts working on the RTX 6000 as promised. (Nvidia PLS.)
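For the curious, the zvol side is nothing fancy - roughly this, with the pool/dataset names and sizes being placeholders rather than my actual layout:

```bash
# Carve a fixed-size zvol out of the pool for the Windows guest.
zfs create -V 1T -b 64K -o compression=lz4 rpool/vm/win11

# The VM then gets the resulting block device as its disk, e.g. by pointing
# libvirt/QEMU at /dev/zvol/rpool/vm/win11.
ls -l /dev/zvol/rpool/vm/win11
```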

2 Likes

I have a hard time seeing Nvidia doing something to hurt their vGPU license income.

But they PROMISED!

On the one hand, you’re right: the 6000 Blackwell is the first card ever to support graphics within a MIG instance without the vGPU ball-and-chain. They added three MIG instance profile types that have the sexy +gfx tag. Which is probably why it’s late.

On the other hand, that’s not a huge risk to vGPU. Four instances on a $10k card in 3 slots (2 + gap) @ 600W is hardly a good value proposition if you’re into VDI. You’re better off with four way cheaper, lower-power physical GPUs, and then you’re not tied to Nvidia at all - but even if you stick with Nvidia, you can get four 4000 Blackwells for the same per-VM VRAM and the same (lower) power envelope at half the price.

MIG just lets you slice the 6000 into two 5000s or four 4000s, matching power & VRAM perfectly, but at a significantly higher overall cost.
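To be clear, the MIG plumbing itself already exists in nvidia-smi; the open question is just whether the +gfx profiles ever show up for this card. A rough sketch of what slicing it would look like (the profile ID is a placeholder, and what’s offered depends on the driver):

```bash
# Put GPU 0 into MIG mode (requires a GPU reset and no running clients).
sudo nvidia-smi -i 0 -mig 1

# List the GPU instance profiles the driver offers; the graphics-capable
# ones would be the "+gfx" variants, if/when they ship for this card.
nvidia-smi mig -lgip

# Create a GPU instance from a chosen profile ID plus its default compute
# instance in one go (the profile ID here is a placeholder).
sudo nvidia-smi mig -cgi <profile-id> -C
```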

1 Like

Falcon NW uses a 280mm cooler in the Threadripper RAK, with two fans per side as well.

1 Like

Why would you ever buy something like this already built?

The number one rule of PCs is: if you want something done right, you have to do it yourself.

This applies not just to DIYers, but also to big system integrators and OEMs.

If you don’t make it yourself, it’s probably going to suck.

Doesn’t matter what it is. Desktops, Workstations, Servers. It’s all the same. Build it yourself or deal with bullshit.

It’s been that way for 40 years now.

Why wouldn’t you? Some people build their cars, most people buy them. Same with PCs.

I do build PCs, way more than I should. It’s fun. But nothing ever works right on the first try because half of it is experimental, because that’s where the fun is.

So I never build my main rig, I pay a company to build it for me, exactly like the thousand they’ve already built, and test it according to procedure, and document the tests, then send me a product that I don’t allow myself to tinker with - and then I turn it on and it stays on for years. Kind of like you’d expect a work truck to behave. And while some people do build very cool cars for themselves, nobody ever builds their work truck.

3 Likes

I guess my experience has been that every time I have used a pre-built machine it works OK, until I try to do anything even remotely off the beaten path.

Like install some new piece of hardware I need to use, or change something in the BIOS etc. etc.

It’s bad with OEM desktops and workstations, and absolutely horrible with servers. I had an HP server that would get a panic attack and go into “all fans on max speed” when I replaced the crappy RAID controller it came with with a proper HBA, for instance. And those fans were little 80mm 14krpm screamers that could be heard the next state over. It was obnoxious.

What with proprietary form factors, lock-ins, lock-outs, vendor CPU locks, unsupported BIOS functions, etc., I have long since concluded that anything pre-built will only lead to trouble eventually when it needs service or an upgrade (because no machine ever stays static over the entirety of its lifetime).

My conclusion has been that it simply winds up being more of a headache to use anything pre-built than to build my own. Almost every time I’ve tried to use a pre-built machine, I’ve lived to regret it.