Alternative to Nvidia Grid Multi GPU virtualization

I have used Nvidia Grid with VMware Horizon View for GPU-accelerated display of work applications, and I have seen some other examples of this in practice, like Nvidia's cloud gaming service, also called Nvidia Grid.

I am trying to find out whether any alternative solutions are being developed by the open source community that could be applied to home use. Since most GPUs are also compute capable, it would seem that emulating multiple GPUs virtually should be possible.

For example, using a 2080 Ti to emulate multiple 1660-class GPUs. I know this might be an issue with Nvidia drivers/licensing/firmware. Not sure if anything like this is being done, or if it is prevented by the hardware manufacturers themselves.

The driver commits sudoku if it detects it’s in a VM


Nvidia has tied that feature to its non-consumer cards.
Even passing one whole card through to a VM doesn't always work well, let alone splitting one.

I’m also curious.
Nvidia seems like a dead end with how they screw with their drivers (my poor 3D Vision 2 kit is now useless :'( ), but maybe something similar can be done with AMD?

According to YouTube it's possible with very old Nvidia GRID cards and old Citrix / VMware … everything else has it fully paywalled and locked down. If you really want the feature, you'll need an open source GPU, not just open source drivers. Craft Computing played with it a bit, and the impression I got is that virtualized GPUs aren't cut out for gaming, which makes sense: they're designed for computational tasks and the kind of basic acceleration that CAD/CAM apps might use.

Platforms like Xeon (X79, even X99), Threadripper, and EPYC offer more than enough PCIe lanes to give a bunch of VMs their own GPU, and cost a lot less than a GRID solution. I don't see what the advantage would be for a home user?
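For what it's worth, whole-card passthrough on those platforms is usually done by binding the guest GPU to vfio-pci before the host driver can claim it. A rough config sketch (the PCI IDs `10de:1b80,10de:10f0` are placeholders for a GPU and its HDMI audio function; substitute your own card's IDs from `lspci -nn`):

```shell
# Find your GPU's vendor:device IDs
lspci -nn | grep -i -e vga -e audio

# Option 1: claim the card at boot via the kernel command line (GRUB),
# alongside IOMMU support:
#   intel_iommu=on iommu=pt vfio-pci.ids=10de:1b80,10de:10f0

# Option 2: a modprobe config so vfio-pci binds before the GPU driver loads
echo "options vfio-pci ids=10de:1b80,10de:10f0" | sudo tee /etc/modprobe.d/vfio.conf
sudo update-initramfs -u   # or dracut -f on Fedora-style distros

# After a reboot, check that vfio-pci owns the device
lspci -nnk -d 10de:1b80
```

This only hands one physical card to one VM, which is exactly the "lots of lanes, one GPU per VM" approach described above, not GPU splitting.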

MxGPU for AMD is even more locked down than Nvidia's offering.


Space and power, maybe? The only homelab server I own that is more than 1U is an HP DL380 Gen9, and if I believe the spec it can only fit and power a single GPU, so a shared GPU is the only useful alternative.
Plus, if you are running dozens of VMs that are rarely used at the same time, you can easily scale down your hardware.

:sob:

Last time I looked into it there aren't any licensing fees, but you are required to use their workstation cards, which cost as much as multiple comparable consumer cards, so not much point there.
A WX 9100 is $1,600; let's be real, for the purpose of cloud gaming it's a Vega 64, which can be had for $300.

Wait, what exactly was the use case OP had in mind?

DRM leases can be used for fully accelerated multiseat, where each seat controls one or more outputs.
And I believe that if one could pass the lease fd through to a VM (access to the render node would probably also be needed), one could use it the way you want.
It would need a special driver in the VM, and who knows how security would suffer, but it could work.
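To make the lease idea concrete, here is a minimal, untested C sketch of carving out one connector + CRTC with libdrm's `drmModeCreateLease`. It assumes `/dev/dri/card0` is the primary device, that the caller holds DRM master, and that the first CRTC/connector pair is the one you want; handing the resulting fd to a VM would still need the hypothetical guest-side shim discussed above.

```c
/* DRM-lease sketch: lease the first CRTC + connector from card0.
 * Build with: cc lease.c -ldrm -I/usr/include/libdrm */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC); /* example path */
    if (fd < 0) { perror("open"); return 1; }

    drmModeRes *res = drmModeGetResources(fd);
    if (!res || res->count_crtcs < 1 || res->count_connectors < 1) {
        fprintf(stderr, "no CRTC/connector available to lease\n");
        return 1;
    }

    /* Objects handed to the lessee: one CRTC and one connector. */
    uint32_t objects[2] = { res->crtcs[0], res->connectors[0] };
    uint32_t lessee_id = 0;
    int lease_fd = drmModeCreateLease(fd, objects, 2, O_CLOEXEC, &lessee_id);
    if (lease_fd < 0) { perror("drmModeCreateLease"); return 1; }

    printf("lease fd %d, lessee id %u\n", lease_fd, lessee_id);
    /* lease_fd would be passed over a Unix socket (SCM_RIGHTS) to the
     * process -- or hypothetical VM shim -- that should own this output. */

    drmModeFreeResources(res);
    close(lease_fd);
    close(fd);
    return 0;
}
```

Note this only delegates display resources; rendering would still go through the render node, which is the part the security question above hinges on.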

My thought on how this could be done would be using a CUDA or OpenCL application to run a virtual GPU, possibly using a vBIOS dump to create the virtual card that could then be passed into a virtual machine.
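For the vBIOS-dump part at least, there is a standard trick: the kernel exposes a card's option ROM through sysfs. A hardware-dependent sketch (the PCI address `0000:01:00.0` is an example; use your own card's address from `lspci`):

```shell
# Dump a GPU's vBIOS via the sysfs ROM interface
cd /sys/bus/pci/devices/0000:01:00.0
echo 1 | sudo tee rom            # enable reading the ROM
sudo cat rom > /tmp/vbios.rom    # copy the ROM image out
echo 0 | sudo tee rom            # disable the interface again
```

That gives you a ROM image commonly used for passthrough setups; the much harder part, emulating a whole GPU behind it in CUDA/OpenCL, is not something I've seen an open source project actually do.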

FYI, I only script; coding at this level is way out of my league.