I am currently planning to build a Threadripper machine that runs virtual machines on top of Linux. The idea is that I would run two VMs (Windows and macOS), each with its own GPU and USB controller passed through. The host will also have a GPU for gaming and 3D modeling (as a hobby).
I haven’t set a hard budget limit. However, keep in mind that I am just a college student who does not make a lot of money. Try to go with the best bang for the buck (e.g., please don’t make me pay for a $300 computer case or a $1000 GPU…).
Prices include shipping, taxes, rebates, and discounts
Total (before mail-in rebates): $4456.07
Mail-in rebates: -$80.00
Total: $4376.07
Generated by PCPartPicker 2017-11-07 12:12 EST-0500
Questions
Now let’s get on to the questions I have!
One of my concerns is GPU cooling. I took a look at how SLI builds are managed to see what type of configuration is commonly used. It seems like blower-style GPUs are a good solution, but what do you think? Should I go with a blower card or an open-air one?
Going from my notes, it seems like an RX 580 is the best choice for my Hackintosh VM, but I heard that not all RX 580 cards work well with GPU passthrough. Is there a list of RX 580s that are known to work nicely with passthrough?
Should I go with an Nvidia GPU for my Windows VM? Or is there another AMD GPU I can use for Windows?
Do you think the case I chose is fine? I am mainly looking for a case that has at least eight expansion slots.
Thank you all for taking the time to read my post!
Typically, in multi-GPU machines, I go for either blower-style or watercooled cards, unless I can get enough airflow in the case.
Not all that sure about this one. Best bet: talk to the folks who’ve done this and see what GPU they used.
I’d go for a 480 or a 390X if you can. Stay away from Nvidia for passthrough at all costs; their consumer drivers detect the VM and refuse to load (the infamous Error 43) unless you hide the hypervisor.
The bottom GPU may be a bit tight, but if you use the bottom slots for the USB controllers, you’ll be fine.
Additionally, the memory you chose is not on the HCL. You’re going to want to find different memory to make sure it will run properly. Also, with all Zen-based chips, you’re going to see significant performance gains from increasing your memory speed, since the Infinity Fabric linking the dies runs at the memory clock. I’d look for something 3200 MHz or higher.
Allow me to show you a picture of my test rig: a Corsair 800D with an R9 Fury in the bottom slot.
Hardware Compatibility List
32GB will probably be fine, and you can always upgrade to 64 later if you need.
Primarily airflow, but theoretically, a magnetic field from the PSU could also affect a GPU.
If you want to buy used, you’ll avoid sales tax and (as an added bonus) get a discount from retail. Don’t let a seller bully you into the whole “sales tax is part of the discount” thing if you’re in the US.
I switched the RAM to something from the HCL that’s 3200 MHz like @SgtAwesomesauce recommended. Somehow that saved money too while still being 64 GB. (The MSI site specifically lists this model in this configuration as working)
When spending a semester’s worth of money on just a computer, I don’t see the sense in not going NVMe. I also swapped in a non-NVMe SSD for still-pretty-fast secondary storage. Two 5 TB drives because they save some money compared to a single 10 TB, and you can just JBOD them.
As an addendum to that, I see you mentioned college. If you have a .edu email address you can buy Western Digital products directly from their website for 20% off. Their prices are already competitive before discount, so you could save a pretty penny that way.
Cheaper PSU that is still very good. Running the math, it would take a couple of years to make up the cost difference between the Corsair you picked and the Seasonic I picked, and that’s if the entire computer were at 100% load 24/7/365. Any load less than that and it would take even longer to make up that $190 difference.
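If you want to sanity-check that claim, here’s the back-of-the-envelope version in Python. Only the $190 comes from the actual parts lists; the load, efficiencies, and electricity rate are placeholder assumptions you should swap for your own numbers:

```python
# Back-of-the-envelope break-even for the $190 PSU price difference.
# Everything except the $190 is an assumed placeholder; plug in the
# real efficiency curves and your local electricity rate.

price_difference = 190.00  # USD (from the parts lists)
load_watts = 1000          # assumed DC load at "100% load"
eff_cheap = 0.87           # assumed efficiency of the cheaper unit
eff_pricey = 0.92          # assumed efficiency of the pricier unit
kwh_price = 0.13           # assumed USD per kWh

# Extra wall draw of the cheaper unit for the same DC load.
extra_watts = load_watts / eff_cheap - load_watts / eff_pricey

# Cost of that extra draw over a year of 24/7/365 operation.
extra_cost_per_year = extra_watts / 1000 * 24 * 365 * kwh_price

print(f"Extra draw: {extra_watts:.0f} W")
print(f"Break-even: {price_difference / extra_cost_per_year:.1f} years at full load")
```

With those assumed numbers it comes out to roughly 2.7 years at constant full load, and proportionally longer at anything realistic.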
All in all that shaved off about $300 and got better performance, but don’t forget to add another $10-15 per fan that you’ll need to fill out your case.
Wow, thanks dude! I’ve applied your parts to my list now!
Wait… you don’t need to buy all the RAM sticks in one kit? I thought you had to do that…
Those are some pretty nice NVMe drives you listed! However, I do need to double-check the IOMMU grouping situation. I remember hearing that the NVMe slots may end up grouped with the GPU or USB controller.
It would be overkill to pass 1.1 TB through to a Windows or Mac VM. I would rather use that space to make a shared drive that Windows, Mac, and Linux can access simultaneously, but that all depends on the IOMMU grouping.
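Once the board is here I can dump the groups and check for myself. From what I’ve read, something like this should work (a minimal sketch, assuming the host kernel boots with `amd_iommu=on` and pciutils is installed):

```python
#!/usr/bin/env python3
# Print every IOMMU group and the PCI devices inside it, to check
# whether the NVMe drives share a group with a GPU or USB controller.
# Assumes the IOMMU is enabled (e.g. amd_iommu=on on the kernel
# command line) and that lspci (pciutils) is available.

from pathlib import Path
import subprocess

groups = Path("/sys/kernel/iommu_groups")
for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
    print(f"IOMMU group {group.name}:")
    for dev in sorted((group / "devices").iterdir()):
        # lspci -nns turns a PCI address into a human-readable name
        desc = subprocess.run(
            ["lspci", "-nns", dev.name],
            capture_output=True, text=True,
        ).stdout.strip()
        print(f"  {desc or dev.name}")
```

Anything that shares a group with a GPU or USB controller has to be passed through together (or kept on the host), so this decides the shared-drive question.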
When I originally picked out the MSI motherboard, this is what I had in mind in terms of the placement of PCI devices (ignoring NVMe drives):
GPU 1 (x16)
GPU 2 (x16)
USB Controller 1 (x16)
USB Controller 2 (x1)
GPU 3 (x16)
Going off of your suggestion, the better solution would be to do it this way, if I understand you correctly:
GPU 1
USB Controller 1
GPU 2
USB Controller 2
GPU 3
I checked the other motherboards to see if this configuration was possible, but it doesn’t seem to be. There are always going to be two GPUs sitting next to each other.
If you’re buying the same model number at the same time, it should be fine. Mixing and matching RAM with different memory dies is what really messes things up. I.e., don’t buy two sticks with Samsung dies and two sticks with Hynix dies and expect them to play nicely.
Only the WD Black is NVMe; the Crucial MX300 is a SATA drive in the M.2 form factor (M.2 ≠ NVMe per se).
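If you ever want to double-check what a drive actually is once it’s installed, `lsblk` reports the transport. A quick sketch (assuming a Linux host with util-linux available):

```python
#!/usr/bin/env python3
# Quick check of which block devices are NVMe vs. SATA; an M.2 slot
# can carry either, and spec pages don't always say which. Assumes a
# Linux host with lsblk (util-linux) available.

import json
import subprocess

out = subprocess.run(
    ["lsblk", "-d", "-J", "-o", "NAME,TRAN,MODEL"],
    capture_output=True, text=True, check=True,
).stdout

for dev in json.loads(out)["blockdevices"]:
    # 'tran' is "nvme" for true NVMe drives, "sata" for M.2 SATA
    print(f"{dev['name']:10} {dev.get('tran') or '?':6} {dev.get('model') or ''}")
```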
I always forget about that. I really just picked the Crucial SSD because it would mean fewer wires, but it won’t be any faster than a normal 2.5" SSD, so going with a 2.5" SSD would accomplish the same thing.
To be honest though, I wouldn’t put too much stock in the compatibility list. Realistically, >80% of memory kits aren’t on any compatibility list because there are just too many kits out there, and there is no way to test them all with every board.
It’s different in the enterprise, of course, but consumer stuff works in 90% of cases. I’ve had just a single DIMM not work in about 18 years, and that was in the Pentium 3 days… (sweet 256 MB memory stick)
The kit he chose would work fine, but I agree that faster RAM is still good on any Zen-based platform.
What you could do (but that is really personal preference) is turn the PSU upside down so the fan is on top, giving the GPU a bit more space and sucking air out as well. Many people don’t like it that way, though.
Technically, you don’t need to buy them in one kit. The problem is that even sticks of the same model number can have incompatibilities (since every chip is unique). The sticks in a kit are matched to one another and tested to work together.
Imo, not really. Especially the WD Black; they are kinda underwhelming for the price. For like $40 more you can get the 500 GB Samsung EVO variant… and look at the difference for those $40.
Also the MX300 is SATA (not bad tho).
Sucks that PCPartPicker doesn’t actually list that, just the M.2 keys, but who remembers those?
This is true on Intel, but with Ryzen and Threadripper, you’ve got to be more careful. They really like Samsung B-die memory for overclocking. Even with X370 AGESA 1.0.0.6, there are ongoing problems.
Yeah, that’s what I was getting at. But also look at how few of the B-die kits are even on the lists. And from what I’ve read, the situation got a bit more relaxed with the AGESA update. The Hynix chips are still not on par, but it’s not as bad as it was on launch day.
I thought I wrote this somewhere, but apparently I deleted it again x_X
To this day, I still don’t know enough about memory to confidently make a suggestion on what’s going to work well; memory is such a shitshow right now.
Not to mention that most manufacturers don’t even list what chips they are using (well, for Crucial it’s obvious, but the rest…)
You “can” go with “certified” or “tested” for Ryzen, but you probably pay more for the badge even though they’re the same kits… like the women’s shampoo kind of deal…
Yeah. It’s extremely frustrating, especially considering memory prices. I want to buy memory that’s going to work in my system, but there’s so much uncertainty going around, and prices are so high that I need to take out a home equity line of credit to finance 64 GB of RAM.
So I updated my storage and swapped the one 512 GB NVMe drive for two different 256 GB NVMe drives. These should actually be NVMe drives, unless I messed up.
That said, I don’t know if these drives will actually work with passthrough…
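From what I’ve gathered, passing an NVMe drive through works like passing any other PCI device: the controller has to sit in its own IOMMU group and get rebound to vfio-pci. A rough sketch of the rebind step (the PCI address is a made-up placeholder, and this assumes the vfio-pci module is already loaded):

```python
#!/usr/bin/env python3
# Sketch: rebind an NVMe controller from the kernel's nvme driver to
# vfio-pci so the whole device can be handed to a VM. The PCI address
# is a made-up placeholder; find the real one with `lspci -nn`. Run as
# root, with nothing on the drive mounted, the vfio-pci module loaded,
# and the device alone in its IOMMU group.

from pathlib import Path

PCI_ADDR = "0000:01:00.0"  # hypothetical address; replace with yours
dev = Path("/sys/bus/pci/devices") / PCI_ADDR

# Unbind from whatever driver currently owns the device (e.g. nvme).
driver = dev / "driver"
if driver.exists():
    (driver / "unbind").write_text(PCI_ADDR)

# Ask the kernel to use vfio-pci for this device, then reprobe it.
(dev / "driver_override").write_text("vfio-pci")
Path("/sys/bus/pci/drivers_probe").write_text(PCI_ADDR)
```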
I am thinking of waiting for an AIO-cooled Vega 56. That way, I can safely have two GPUs sitting next to each other.
Anyway, I have updated the OP to include the new parts.