Build a VM with full 3D support

I have various hardware and I would like some recommendations on what I can use to set up a Windows 7 VM with full 3D graphics support. My use case: I need Autodesk Fusion 360 to design 3D models. I’ve been running Windows 7 in VMware Player, but a recent Fusion 360 update made the VM’s 3D support too slow to be usable. I would also like to play some Windows games. Very few, but some.

My primary PC is an MSI GT72S 6QE Dominator Pro G running Linux Mint 18.2, with 48 GB of RAM and an Nvidia GTX 980M. I’ve decided that NoMachine probably offers the best remote desktop experience, with Parsec possibly a distant second. Parsec doesn’t support Windows 7, which is a non-starter.

I’m embarrassed to admit that despite decades of experience with niche and specialty PC builds, my recent attempt at setting up a VM with GPU passthrough has fallen flat. I purchased a GTX 1060 because it supports hardware H.264 and HEVC encoding, which should make streaming from the VM more performant. Unfortunately, I then realized that my Intel i7-2600K doesn’t support VT-d, which PCIe passthrough requires. So I purchased an Intel i7-3770, which does. Then I realized my Z68 chipset doesn’t support passthrough either. :roll_eyes: Truthfully, I probably discovered all of this months ago when I first considered the project, but I rushed in this time and forgot my earlier conclusion that the hardware couldn’t do what I needed.

So, here is the hardware I have available to create a GPU VM:

Dell R610 (Series 2), 48 GB RAM, dual Xeon X5650 processors
Radeon HD 7850 x4 (2GBD5-2DHE/OC 2GB)
Nvidia GTX 1060 SSC 6GB
A couple of Nvidia GPUs probably not worth listing
ASUS P8Z68 DELUXE/GEN3 (LGA 1155), 32 GB RAM, with either the i7-2600K or the i7-3770
MSI Z87-G55 (LGA 1150, Intel Z87) with an Intel Pentium G3220 Haswell dual-core at 3.0 GHz, maybe up to 8 GB RAM

I think that’s about it for relevant hardware. From what I can see, the only system supporting hardware passthrough (VT-d) is the R610. That’s the route I was going to try next, but information about successful GPU passthrough on the R610 is scant. At this point I’m tired of throwing money away, but I’m willing to spend some $$ if it gets me over the hump and makes my previous purchases useful. The original intent was to replace my rented GPU VM from ParsecGaming with something I own and control. Even though the VM only costs ~$0.50 per hour plus $5/month maintenance, it’s hard to focus on my 3D designs while the thought of paying for the time used nags at me. Now the problem is fighting the nagging feeling that I could have purchased a couple of years’ worth of VM time for what I’ve spent on hardware upgrades.

Sonnet has some interesting Thunderbolt GPU enclosures, though I’m not sure my hardware will support a Thunderbolt add-in card, let alone pass a GPU through from an add-in Thunderbolt interface.

I will use Proxmox (KVM) as my hypervisor.

If you think the R610 might work, then try it.

First, see if it has a VT-d option in the BIOS; if it does, make sure it is enabled. Then enable intel_iommu in GRUB, reboot, and check the IOMMU groups. Fill all the PCIe/PCI slots you intend to use in production before checking the groups.
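The steps above can be sketched roughly like this (assuming a stock Proxmox/Debian host booting with GRUB; paths may differ on your install):

```shell
#!/bin/sh
# 1) After enabling VT-d in the BIOS, add intel_iommu=on to the kernel
#    command line by editing /etc/default/grub so it contains:
#      GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on"
#    then run: update-grub && reboot

# 2) After the reboot, list each IOMMU group and the devices it contains.
list_iommu_groups() {
    base="${1:-/sys/kernel/iommu_groups}"
    for dev in "$base"/*/devices/*; do
        [ -e "$dev" ] || continue   # no matches -> IOMMU off or unsupported
        group=$(basename "$(dirname "$(dirname "$dev")")")
        echo "IOMMU group $group: $(basename "$dev")"
    done
}

list_iommu_groups
```

If the GPU (and its audio function) show up alone in a group, passthrough should be straightforward; if they share a group with other devices, you may need to move cards between slots.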

If the slots/cards you want to pass through are in their own groups, then you are good to go. Windows 7 is not supported under OVMF, so you should use the Proxmox default, SeaBIOS.
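For reference, a minimal sketch of what the relevant VM config lines might look like. The VM ID (101) and PCI address (01:00) are placeholders for illustration; check your actual GPU address with lspci:

```
# /etc/pve/qemu-server/101.conf -- relevant lines only
bios: seabios
machine: q35
hostpci0: 01:00,pcie=1,x-vga=1
```

Giving hostpci0 the address without a function suffix (01:00 rather than 01:00.0) passes through all functions of the card, i.e. the GPU and its HDMI audio device together.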

Follow the directions here-
https://pve.proxmox.com/wiki/Pci_passthrough

As a follow-up, I was able to get this working, though there were some problems. It seems the R610 will only run two PCIe add-on cards when the 700 W power supply is installed; otherwise it only supports one. This was interesting, because the power usage of the system never went over 200 W. The other power supply I had was 500 W. At boot, the system would detect which power supply was in use and severely limit the machine’s performance, throttling the RAM and CPUs to minimum speeds. The system would have worked fine with my setup, since the add-on GPU drew no power from the server itself. This seems more like a method to get server owners to buy a more expensive power supply.

The other issue is reduced GPU performance. The best results so far have come from Moonlight with Nvidia GameStream. Steam In-Home Streaming isn’t working, for reasons I haven’t been able to figure out. Recently I tried Parsec on a rented system and the performance was AMAZING. The problem with Parsec is that it requires Windows 8.1+; the rented system was running Windows Server 2016. GameStream/Moonlight might work equally well here - the problem may be running the GPU over a PCIe x1 connection. I’m working on a solution for that. Another theory is that Windows 7 simply lacks the newer technologies for streamlined desktop streaming.

The results of my attempts so far, with Windows 7, are as follows:

Steam In-Home Streaming: Doesn’t work. The Linux PC is not able to see the Steam service running on the remote Windows VM. I’m not really sure why - my non-VM Windows 7 PC is able to stream just fine (or was, last time I tried).

NoMachine: This works very well. The host can use the GPU’s hardware H.264 encoder for streaming. Unfortunately, the Linux NX client doesn’t support hardware decoding. The performance of NX was OK, but limited - though that might have nothing to do with the software decoding.

GameStream/Moonlight: Supports hardware encoding AND decoding on both sides of the connection. Works very well, but sometimes lags. Performance tests run successfully, but the results are very low compared to what this hardware should deliver. That may just be the hardware limitation described above.
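To put the x1 theory in rough numbers (assumption: PCIe 2.0, ~500 MB/s usable per lane per direction - your slot generation may differ):

```shell
#!/bin/sh
# Back-of-envelope only: an x1 link gives the GPU 1/16th of the host
# bandwidth of a normal x16 slot, which can starve texture and asset
# uploads during gameplay even when the GPU itself is fast.
mb_per_lane=500
echo "x1  link: $((1 * mb_per_lane)) MB/s"
echo "x16 link: $((16 * mb_per_lane)) MB/s"
```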

Currently, I’m looking forward to trying Parsec, but it requires Windows 8.1+. I’m not keen to install any Windows beyond Win7 unless the system’s network and internet access is COMPLETELY restricted to just the programs I need. I have a new thread discussing my problems with setting up this sandboxed Windows network.


Something that should work for streaming the GPU VM’s desktop: run ESXi on the R610 and use VMware Workstation Pro connected to the ESXi host. Workstation Pro comes with a 30-day trial (and ESXi is free for non-commercial use), so at the least you could see whether this gives you what you want. A VMware license normally costs quite a lot (around $220-250 US), but if you look on eBay or Amazon, or google around, you can get it cheaper (I found a seller on Amazon selling it for $36).

You would need to VPN into your home network to use Workstation Pro to connect to the VM when out and about, but over LAN it only adds about 0.1-0.5 ms of delay and allows attaching USB devices remotely.

Word of warning though: VMware Workstation Pro is a bitch to set up on Linux, and if you update certain libraries, like g++ or gcc, it will break and you’ll have to do some fixing to get it working again.
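When that happens, the usual symptom is that Workstation’s kernel modules (vmmon/vmnet) no longer load, and the usual fix is rebuilding them. A quick diagnostic sketch (module names are the ones Workstation ships; adjust if your version differs):

```shell
#!/bin/sh
# Check whether VMware's kernel modules are loaded; if not, suggest the
# rebuild command. Rebuilding needs kernel headers and a working compiler.
check_vmware_modules() {
    missing=""
    for m in vmmon vmnet; do
        grep -qw "$m" /proc/modules 2>/dev/null || missing="$missing $m"
    done
    if [ -n "$missing" ]; then
        echo "missing:$missing -- run: sudo vmware-modconfig --console --install-all"
    else
        echo "vmware modules loaded"
    fi
}

check_vmware_modules
```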

Also, I have never used NoMachine, but I’m definitely going to check it out. Yay new projects!