I’m looking for advice on speccing out the components for a PC to do AI/LLM development. Think PyTorch + Hugging Face, if you’ve done that kind of thing.
I’m more or less committed to an NVIDIA RTX 4090 as the GPU, but everything else is up for grabs. I’ve heard W-class Intel Xeons are appropriate, but that gets expensive.
Can someone suggest CPU, case, SSDs, RAM, power supply, cooling, etc. — both makes/models and sizing?
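For concreteness, the workload is roughly things like this (just a sketch; the model name is a placeholder, and it assumes torch and transformers are installed):

```python
# Rough sketch of the kind of thing I mean: load a Hugging Face model onto the
# GPU in fp16 and generate. The model name is just a placeholder example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-v0.1"  # placeholder 7B model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16
).cuda()

# ~2 bytes/parameter in fp16, so a 7B model is roughly 14 GB of VRAM
print(f"Model VRAM footprint: {model.get_memory_footprint() / 1e9:.1f} GB")

inputs = tokenizer("Hello", return_tensors="pt").to("cuda")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=50)[0]))
```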
Take everything I’m about to say with a spoon of salt.
AI work is almost entirely GPU-bound, a bit like crypto mining, except here you actually want to keep the full PCIe bandwidth to each card. So an older server or workstation board with plenty of PCIe lanes can save you a lot of money.
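If you want to verify a card is actually getting its full link, here’s a quick sketch using the nvidia-ml-py (pynvml) bindings; note that the reported *current* link speed can drop at idle due to power saving:

```python
# Sanity check that each GPU is getting its full PCIe link.
# (Sketch using the nvidia-ml-py / pynvml bindings: pip install nvidia-ml-py)
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    gen_cur = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
    gen_max = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(handle)
    width_cur = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
    width_max = pynvml.nvmlDeviceGetMaxPcieLinkWidth(handle)
    # Current values can read low while the card is idle (power saving).
    print(f"GPU {i} ({name}): PCIe gen {gen_cur}/{gen_max}, x{width_cur}/x{width_max}")
pynvml.nvmlShutdown()
```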
There are board-plus-CPU bundles like this, which would let you add more GPUs later.
(A case with 11 expansion slots isn’t common, but that’s the size you’d need for that board.)
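To make the “more GPUs later” part concrete, here’s a rough sketch of how the Hugging Face stack spreads one model across however many cards are installed (assumes transformers + accelerate; the model name is just a placeholder):

```python
# Sketch: shard one model across all visible GPUs with device_map="auto".
import torch
from transformers import AutoModelForCausalLM

print(f"GPUs visible to PyTorch: {torch.cuda.device_count()}")

model_name = "mistralai/Mistral-7B-v0.1"  # placeholder; any causal LM works
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # halves VRAM vs. fp32
    device_map="auto",          # accelerate splits layers across available GPUs
)
print(model.hf_device_map)      # shows which layers landed on which device
```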
Whether the inter-socket bandwidth limit on a dual-socket board would matter for your workload, I don’t know.
Also, both of those are better suited to dual-slot cards, and most RTX 4090s take up three slots or more.