Hi there,
Looking at running Mixtral 8x7B and wondering if it's worth going with an MI210 or MI250 versus an A100 or L40S. I'm curious what the performance would be, and whether anyone running a similar setup could provide insight.
I was also looking at it from a VRAM perspective: the MI210 has 64GB and the L40S has 48GB, so I'd need 4x L40S cards to match the VRAM of 3x MI210s.
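As a rough sanity check on those card counts, here's a back-of-the-envelope VRAM estimate. This is a sketch, not a benchmark: the ~46.7B total parameter count for Mixtral 8x7B is the commonly cited figure (all experts stay loaded), and the 1.2x overhead factor for KV cache and activations is an assumption that varies with batch size and context length.

```python
def vram_gb(params_billion, bytes_per_param, overhead=1.2):
    """Approximate serving VRAM in GB: weight size times an
    assumed overhead factor for KV cache and activations."""
    return params_billion * bytes_per_param * overhead

# Mixtral 8x7B loads all experts, so ~46.7B total parameters.
for name, bpp in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name}: ~{vram_gb(46.7, bpp):.0f} GB")
# fp16: ~112 GB  -> fits in 3x MI210 (192GB) or 3x L40S (144GB)
# int8: ~56 GB   -> fits in a single MI210 (64GB)
# int4: ~28 GB   -> fits in a single L40S (48GB)
```

So at fp16 either 3-card configuration has headroom, while quantized variants change the card count entirely; which one matters depends on the quality bar the client accepts.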
I'm honestly not sure. This is for a client of mine who is commissioning me to build rigs to run Mixtral 8x7B models for genAI.
After seeing a recent post, I'm wondering where I can even get an MI250X card to put into a server. None of my contacts seem to have them; they only show up in pre-built systems that typically run over $100k right now.
Mixtral 8x22B was released a few days ago, and the last few models (DBRX, Grok, Command-R) are quite big, so you may want tiers rather than a one-size-fits-all server for this stuff.
What do you mean by tiers? I have a client who wants to run models, and they're asking me to build servers for them.
I was looking at Nvidia options, but when I stumbled upon the AMD stuff I wondered what performance was available and where the GPUs/accelerators themselves could be found.
Is it just that the MI250X doesn't really exist yet outside of special partner servers?
Doesn’t seem like a good way to penetrate the market…