Help speccing a PC for AI/LLM/GPT development

I’m looking for advice on speccing out the components for a PC to do AI/LLM development. Think PyTorch + Hugging Face if you’ve done that kind of thing.
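For context, the workload is roughly this kind of thing (a minimal sketch; the model name is just a placeholder, not what I’d actually run):

```python
# Quick smoke test of the kind of work I mean: load a small Hugging Face model
# and run a forward pass on the GPU. "gpt2" is only a placeholder example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; real work would use larger models
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to("cuda")

inputs = tokenizer("Hello, world", return_tensors="pt").to("cuda")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=20)

print(tokenizer.decode(outputs[0]))
print(torch.cuda.get_device_name(0))
```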

I’m more or less committed to an NVIDIA RTX 4090 as the GPU, but everything else is up for grabs. I’ve heard Intel Xeon W-series CPUs are appropriate, but that gets expensive.

Can someone suggest a CPU, case, SSDs, RAM, power supply, cooling, etc., both makes/models and sizing?

Take everything I’m about to say with a spoonful of salt.
AI work is almost entirely GPU-bound, like crypto mining, except here you actually want to keep full PCIe bandwidth to each card.
So an old server board, which gives you plenty of PCIe lanes for cheap, will save you a lot.
There are bundles like this, which would allow you to add more GPUs later.
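If you want to verify a slot is actually giving a card full bandwidth, something like this works (a sketch; it assumes nvidia-smi is installed and on PATH):

```python
# Print the current PCIe link generation and width for each NVIDIA GPU,
# so you can confirm a slot is really running at x16 and not x4/x8.
# Assumes nvidia-smi is installed and on PATH.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,name,pcie.link.gen.current,pcie.link.width.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

Note that cards can downshift the link at idle to save power, so check the reading while the GPU is actually under load.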


There’s even cheaper, older stuff that should be fine. DDR3 and the CPUs that go with it are dirt cheap.

(A case with 11 expansion slots isn’t common; that’s the size you need for that one.)
Whether the inter-socket bandwidth limit on a dual-socket board would affect you, IDK.
Also, both of those are better suited to dual-slot cards.
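If you do end up on a dual-socket board, you can at least see which CPU/NUMA node each GPU hangs off of from the topology matrix (a sketch; again assumes nvidia-smi is available):

```python
# Show GPU/CPU topology, including CPU affinity and NUMA node per GPU, which
# matters on dual-socket boards where traffic may cross the inter-socket link.
# Assumes nvidia-smi is installed and on PATH.
import subprocess

subprocess.run(["nvidia-smi", "topo", "-m"], check=True)
```

The matrix shows whether two devices talk through the same root complex or have to cross the socket interconnect.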


Great advice, thank you.

What @MarcWWolfe recommended is a good idea and can save you some money.

Are you looking at doing a rack-mount or standard tower config? That will be a big factor in your parts choices.