Server as multiple gaming rigs

So, over the last year I have gotten to play with some semi-old (Sandy Bridge E5) servers at work. I have had to redo these boxes from the ground up, and I am amazed at just how much a single box can do with minimal performance hit between the different servers on board. Lately I have been looking for an excuse to build a 'real' server (something bigger than my old AMD A10 converted FreeNAS server), and I came upon VT-d and VT-d2 technology that would supposedly allow me to pass a GPU (or at least an AMD GPU) through to a VM.

So obviously the big question that comes to mind (and would be a super interesting show idea) is: what about building a single server that can act as multiple game rigs in a single box? I currently have my PC, my wife's PC, a small Server 2012 R2 box (learning for work), and a FreeNAS box, and I am looking at building a few game rigs for my kids and their friends in a few years. The idea is that rather than building (and maintaining) several PCs, I could have a single large server in the basement, and then have end-user stations consisting of a display and a Compute Stick or cheap NUC. When someone wants to do something that requires heavier lifting than Chrome or Netflix, they could use remote desktop, or Steam in-home streaming, or some other service to connect to their VM and have all of their stuff on any 'PC' in the house.

Thoughts? Ideas? Would Wendell or someone be willing to attempt building a proof of concept box for a video or something?


Linus has done something like this over at LTT with unRAID. They called it the 7 Gamers, 1 CPU project. Instead of having all those boxes run locally, you can just have them running in the basement and then stream from them, but the concept is the same as far as I can see :)


As @thijsiez said, Linus made a video of it using unRAID, and it does work, but latency is obviously an issue with this setup, so games like Counter-Strike, Battlefield, CoD, etc. might be unplayable in a competitive environment.


Thanks for linking that, and yes, latency does become an issue, unfortunately. But @CaedenV, if you have the funds for a setup like this, I hope you will also invest in your network to support all the traffic this will cause. A full gigabit network is a good place to start and probably necessary if you have more than a few clients at once. I also kinda have some doubts about a Compute Stick being powerful enough to support this streaming, so I'd look into the thin clients Linus is using :)


Here is a video of Wendell doing it with one VM. It isn't 100% what you want, but the general idea is the same; you would just be running the VM/VMs on a stripped-down distro instead.


Holy cats... how was this not in any of the searches I did on this? lol
I obviously won't do anything this insane... but I am thinking something along the lines of two 8-core (16-thread) Intel chips, 128 GB of RAM, a few SSDs and a bunch of HDDs, and 3-4 GPUs. That should do the trick while staying under $4,000-5,000.
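To sanity-check that spec, here is a rough resource-split sketch. All the numbers (host reservation, per-VM sizes) are assumptions for illustration, not benchmarks:

```python
# Rough capacity check for the proposed build: two 8-core/16-thread CPUs
# (32 threads total) and 128 GB of RAM shared by several gaming VMs.

TOTAL_THREADS = 32      # 2 x 8 cores with Hyper-Threading
TOTAL_RAM_GB = 128
HOST_THREADS = 4        # assumed reservation for the host/hypervisor
HOST_RAM_GB = 16        # assumed reservation for the host OS and caching

def vm_capacity(vcpus_per_vm, ram_per_vm_gb):
    """How many identical VMs fit after the host reservation?"""
    by_cpu = (TOTAL_THREADS - HOST_THREADS) // vcpus_per_vm
    by_ram = (TOTAL_RAM_GB - HOST_RAM_GB) // ram_per_vm_gb
    return min(by_cpu, by_ram)

# e.g. gaming VMs with 6 vCPUs and 16 GB each:
print(vm_capacity(6, 16))   # -> 4, which lines up with the 3-4 GPU plan
```

With those assumed sizes the box is CPU-bound, not RAM-bound, so 128 GB leaves plenty of headroom for host-side caching.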


Not terribly worried about latency. My kids are currently pretty young (3 and 5), and the games I play tend to be more strategy or turn-based, so lag is not the end of the world. Not building today... more thinking about 2-3 years from now, when they will start playing 'real' games on their own.

Plus, the motherboards I was looking at are in the $450-600 range and have dual 10 Gbps ports on board (a single 10 Gbps add-in card typically costs that much by itself! The price is coming down so fast!). Paired with a similar switch, it should feel very close to playing directly on a box (each end user would still be capped at 1 Gbps, but they would each have a full 1 Gbps lane to the server rather than sharing 1-4 Gbps between them).
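A quick back-of-envelope check of that claim: with a 10 Gbps server uplink and each client on a dedicated gigabit link, the uplink is nowhere near saturated at household scale. The per-stream bitrate below is an assumption (typical game-streaming bitrates are in the tens of Mbps):

```python
# Bandwidth headroom sketch: how much of a 10 Gbps uplink do N streaming
# clients consume? Stream bitrate is an assumed figure, not a measurement.

UPLINK_MBPS = 10_000       # one of the board's dual 10 Gbps ports
CLIENT_LINK_MBPS = 1_000   # each thin client on gigabit

def uplink_utilization(clients, stream_mbps=50):
    """Fraction of the 10 Gbps uplink used by `clients` streams."""
    per_client = min(stream_mbps, CLIENT_LINK_MBPS)  # client link is the cap
    return clients * per_client / UPLINK_MBPS

print(uplink_utilization(4))   # 4 clients at 50 Mbps -> 0.02 (2% of uplink)
```

So even a whole LAN party of clients would barely dent the uplink; the real bottlenecks would be GPU and CPU contention on the server side, not the wire.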

... unRAID needs a different name. I have heard about it before in passing and always assumed it was a virtual RAID controller, similar to ZFS or something.

You don't need to worry about Ethernet ports...

Get a display/monitor hub and place it somewhere central (the point closest to all the places that need a monitor). (I recommend connecting them to the hub with DisplayPort and then running VGA out, but DisplayPort out would work too...)

Get the wiring done with USB outlets and connect them to one or more higher-performance USB hubs (you could have a hub per room). How about making them USB wall outlets?

Server-wise, it may be hard to find a motherboard that supports 3-5 GPUs and has the power delivery to do so; it may well be more expensive.
Memory-wise, make sure you populate all the slots so you get hexa-channel memory or something like that; with workloads like this it will matter.
Get SSDs as a cache, and store all KVM data locally on HDDs. You will still cache everything to memory (you should keep around 8 GB of memory for cache per KVM), and make sure you leave your server OS about 12-16 GB of free memory.
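Putting that memory rule of thumb into numbers (guest RAM sizes here are assumptions; the 8 GB cache per KVM and 12-16 GB host reserve are from the advice above):

```python
# Memory budget sketch: guest RAM + 8 GB cache per KVM guest + host reserve.

TOTAL_RAM_GB = 128
HOST_FREE_GB = 16        # upper end of the suggested 12-16 GB host reserve
CACHE_PER_VM_GB = 8      # suggested per-guest cache allowance

def ram_needed(guest_ram_gb_list):
    """Total RAM consumed by guests, their caches, and the host reserve."""
    guests = sum(guest_ram_gb_list)
    cache = CACHE_PER_VM_GB * len(guest_ram_gb_list)
    return guests + cache + HOST_FREE_GB

budget = ram_needed([16, 16, 16, 16])    # assumed: four 16 GB gaming VMs
print(budget, budget <= TOTAL_RAM_GB)    # 112 True -> fits in 128 GB
```

So four 16 GB guests fit in 128 GB with this rule; a fifth would push the budget to 136 GB and blow past it.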

  • For GPUs, I recommend getting something with more memory, so maybe wait for Vega (then you could buy two with 16 GB each)

Yeah that'll do fine hahaha :P