Whole house PC with Linux as backbone

Hi all,

Not sure if this is the correct place for this topic since it includes a lot of categories, but it is Linux based. Also related to my other current topic here.

Basically I want to build a beast of a PC with 2 or more GPUs and use it as the main computing power for my whole home, which currently would be 2 PCs and a server.

Ideally I would like to have fairly weak ITX systems that just have an APU and a 10Gb NIC, and stream a full-fat Windows VM over the 10Gb network. The client ITX PC would basically just be acting as a network interface as well as display outputs and HID inputs.

On top of serving up VMs this PC would also be responsible for acting as a NAS for all of my media and such (I currently have an Avago MegaRAID 9286 card with 6x 4TB spinning rust and 2x 128GB SSDs running a CacheCade setup). Ideally it could do all of this simultaneously while streaming 2 Windows VMs for gaming.

Is this even possible, and would I be able to play all my current games as I do on my full windows PC? How many GPUs would I need? I know at least 2 for the 2 gaming VMs, but would I need a third to keep the NAS and everything running? I assume it will need at least 2x 10Gb NICs, if not a 40Gb.

System specs are planned to be a Threadripper or Threadripper 2 with at least 16 cores and either 64 or 128 GB of RAM. It would have a dedicated NVMe SSD for each VM and a dedicated SATA SSD to run Linux off of.

How do you plan on streaming the games: Steam In-Home Streaming, NVIDIA GameStream, or a straight connection to the VM?

Another issue you may run into is that you will need two Steam accounts, each with its own license for a game, if two people are playing the same thing at the same time.

You could also use that machine as a media server: have it run Plex or another service and let it do the heavy lifting. You can do that with little overhead using Docker or LXC.
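For example, Plex in Docker is basically one command. This is just a rough sketch assuming the linuxserver/plex image; the paths, user/group IDs, and timezone are placeholders you would swap for your own:

```bash
# Rough sketch: Plex in a Docker container using the linuxserver/plex image.
# --net=host keeps LAN discovery simple; PUID/PGID run Plex as your media
# user; TZ and the two -v paths are placeholders for your own setup.
docker run -d \
  --name=plex \
  --net=host \
  -e PUID=1000 -e PGID=1000 \
  -e TZ=America/Chicago \
  -v /srv/plex/config:/config \
  -v /srv/media:/media \
  --restart unless-stopped \
  linuxserver/plex
```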

Unless you are going to have the machine do remote compiles or 3D/CAD renders, 64GB of RAM should be more than enough. I would focus more on CAS latency and RAM speed.

I would do three GPUs: one low-spec Nvidia Quadro for the host and maybe Vega for the VMs. That way you have two Windows VMs with 16 GB of RAM each and more in reserve if you are going to do stuff like Plex and Nextcloud. I recommend Vega since it is a little easier to pass through than team green, and it helps to have a different GPU vendor for the host than for the VMs.
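On the passthrough side, the host setup boils down to binding the guest GPUs (and their HDMI audio functions) to vfio-pci so the Windows VMs can claim them. Only a sketch; the PCI IDs below are the common Vega 64 GPU + audio pair, so confirm your own with `lspci -nn`:

```bash
# Rough sketch of host-side VFIO binding for the guest GPUs.
# 1. Reserve the cards for vfio-pci (example IDs -- check `lspci -nn`):
echo 'options vfio-pci ids=1002:687f,1002:aaf8' | sudo tee /etc/modprobe.d/vfio.conf

# 2. Make sure AMD-Vi/SVM is enabled in the BIOS and add `iommu=pt`
#    to the kernel command line.

# 3. Rebuild the initramfs so vfio-pci grabs the cards early (Fedora example):
sudo dracut -f --kver "$(uname -r)"
```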

That being said, I would recommend having the NAS separate for security reasons; you don’t want your files directly touching the edge of the network. Set up a whitelist of NFS connections to the media server.
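For the whitelist, something along these lines on the NAS is enough. The path and IP are examples; the idea is just that only the media server’s address can mount the export:

```bash
# Rough sketch: export the media pool read-only to a single whitelisted host
echo '/srv/media  192.168.1.20(ro,no_subtree_check)' | sudo tee -a /etc/exports
sudo exportfs -ra    # re-read /etc/exports and apply the change
```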

I would look for a motherboard that can do x8/x8/x8 or better so future upgrades won’t be bottlenecked.

Add a Steam Link or Nvidia Shield (it supports Steam In-Home Streaming now) and have couch gaming as an option too.

Following up: I have a semi-similar setup, but with a Shield as my media box and my gaming laptop as the game host. I stream to my TV and to a thin-and-light laptop.

I’ve had a similar idea, except with USB-C laptop docks as the “clients”, each connected directly to the host and its respective VM via passed-through cards. I haven’t been able to find USB-C to Ethernet adapters, though.

About the only way I can see getting it done as OP visualized is to remote into each VM from each client, which isn’t ideal. VNC or RDP introduce lag and overhead, and it just wouldn’t be a smooth experience.

I want to do the same but it sucks for gaming.

The only real way to game at a distance from your computer is to use something like an Elgato hub and a very long optical Thunderbolt cable, which together cost thousands.

We need @wendell and the crew to look into a software solution for this as well. Or maybe a fancy fiber-connected L1Techs KVM that does not cost an arm and a leg compared to Elgato and Thunderbolt.

Some more info on HDBaseT


Thanks so much for the HDBaseT suggestion. That’s very close to exactly what I’m trying to accomplish. It’s too bad it’s almost exclusively HDMI. It might work for what @The_Awful is trying to do as well?

I might get a monoprice HDBaseT setup, a cheapish USB 2.0 Ethernet extender, and a USB card and see how it goes. At minimum then I’d have KB, mouse, and HDMI monitor remotely connected to the VM.

Plugable makes them I believe.

Really liking this one: Link

Steam In-Home Streaming isn’t a bad solution; I use it. Unless he is into really competitive games, the input lag is more than manageable.

You can set up Thunderbolt displays pretty far away from the machine too. Linus has a video about it.

Thanks for the advice. So it sounds like unfortunately it may not be possible to do over the network even if it is 10Gb.

I like the HDBaseT idea, but it seems limited to a single monitor. Not that it’s the end of the world, but I currently run dual Dell U3014s at 2560x1600 (long live 16:10!), as well as an Oculus, so that’s 3 display outputs total.

As for the games, my wife and I both have our own Steam libraries so that isn’t an issue. Right now we are set up with 2 separate PCs, 1 of which does triple duty as a file server, media PC, and gaming PC in the living room.

The Thunderbolt solution had occurred to me, but since I refuse to buy any more Intel products, that one is out.

Does anyone make a device that can combine USB 3.1 Gen 2 data with a DisplayPort signal? I know the Type-C spec supports this, but I haven’t seen it implemented outside of phones and laptops, and even on laptops those tend to be Thunderbolt 3 ports.

If I could do that and then find an optical Type-C cable, I can see this being a decent option, as it would give me enough video signal to at least drive my 2 displays, and enough USB for whatever I need. The Oculus might still be an issue, but I could work that out.

For the GPUs I would probably keep my current 980Ti as one card and pick up a Vega 64 as another, or maybe sell the 980Ti and go all Vega, as I refuse to buy any more Nvidia products either. Team blue and team green have made a habit of crapping on their consumers IMO, so I just won’t buy their stuff; since AMD has made a comeback, I’m OK with that.


Where do you plan on setting up the server? You may be able to run the Oculus from the host machine directly.

So I got one of the HDBaseT extenders I linked to earlier. It actually works very well. Zero lag, full resolution. Pretty cool, actually. I made a 30 foot CAT 6 cable and that works great.

Unfortunately, for some reason, when either the USB or the HDMI (or both together) is plugged into the host system, the OS (Fedora 27) hangs at boot. Not sure if it’s something to do with the extender, or with how Fedora and the passthrough are set up. Kinda frustrating.

If I unplug the extender from the host, the system boots fine. After I log in I can plug everything back in, start the VM, and it works fine. The keyboard and mouse show up as separate USB devices and can be assigned to the VM with no problem.
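For anyone trying the same thing, attaching them is just a libvirt USB hostdev entry. Rough sketch below; the VM name and the vendor/product IDs are examples (grab the real IDs from `lsusb`):

```bash
# Rough sketch: hot-attach a USB keyboard/receiver to a running libvirt VM
cat > usb-kbd.xml <<'EOF'
<hostdev mode='subsystem' type='usb' managed='yes'>
  <source>
    <vendor id='0x046d'/>
    <product id='0xc52b'/>
  </source>
</hostdev>
EOF
virsh attach-device win10 usb-kbd.xml --live
```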

I’ll do some more experimenting, maybe do a fresh install of Fedora and set up the passthrough from scratch.
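Roughly what I plan to poke at before reinstalling; just a sketch of the obvious checks, since I suspect the hang has to do with which driver claims the extender’s USB controller at boot:

```bash
# Logs from the previous (hung) boot:
journalctl -b -1 -p warning

# Which kernel driver currently owns the GPU and the USB controllers:
lspci -nnk | grep -A3 -Ei 'vga|usb'

# IOMMU groups, in case the USB card shares a group with something the host needs:
find /sys/kernel/iommu_groups/ -type l
```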