I was browsing eBay the other day and saw that AMD MI50 16GB cards are going for peanuts nowadays, and it got me thinking that these cards might be a good way to play around with AI. The only thing is I can’t really find any reports on using such a card anywhere.
Has anyone on here had some experience with these cards, and would they be worth the $160 for something like AI or maybe a homelab?
They’ve run an MI60 server before, which you can see in their post history. Probably in a few days they’ll make a more detailed post about their experience using the MI50s.
What are people reflashing them to?
I know they use the same die as the Radeon Pro VII and those work great in Mac/Hackintosh.
I’m trying to get my hands on one of them to see if it’s worth it for inference at a low price. Idle power draw is a big deal for me as well, because the card would sit idle with models loaded 90% of the time and only spin up when I send it some HA voice assistant requests.
I am running two of them at the moment to play around with some of the distilled DeepSeek and Gemma models using Ollama. They work great, just don’t try to virtualize them: they’re bare metal only, unless you really want to try to fix the GPU reset bug. I am also running them re-flashed as Radeon Pro VIIs.
As for idle power draw, each card sits at around 20-25 W. I get around 18 tokens per second running the DeepSeek 32B distill on Ollama.
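If anyone wants to reproduce the tokens-per-second number, here’s a rough sketch of how I’d measure it with the Ollama Python client. The model tag is just a placeholder for whatever you’ve actually pulled, and I’m assuming the eval_count/eval_duration fields behave the way the Ollama API docs describe, so treat it as a starting point rather than gospel.

```python
# Minimal sketch: time a single generation against a local Ollama instance.
# Assumes the `ollama` Python package is installed and the model is pulled.
import ollama

MODEL = "deepseek-r1:32b"  # placeholder tag, use whichever distill you pulled

response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Explain the GPU reset bug in one paragraph."}],
)

# The final chat response reports generation stats: eval_count is the number
# of generated tokens, eval_duration is the generation time in nanoseconds.
tokens = response["eval_count"]
seconds = response["eval_duration"] / 1e9
print(f"{tokens} tokens in {seconds:.1f}s -> {tokens / seconds:.1f} tok/s")
```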
I haven’t integrated it into my HA setup yet; that’s on the to-do list. But for $110 per card, I’ve been having some fun.
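And for checking the idle draw on your own cards, this is roughly how I’d poll it, assuming the ROCm tools are installed and rocm-smi’s --showpower output looks like it does on my install:

```python
# Rough sketch: read the per-GPU package power from rocm-smi.
import subprocess

# Prints one line per GPU, something like:
# GPU[0] : Average Graphics Package Power (W): 22.0
out = subprocess.run(
    ["rocm-smi", "--showpower"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)
```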