I recently upgraded my entire homelab setup because pulling 700+ watts from the wall constantly was starting to make my wallet cry.
The previous setup included a dual 16-core Opteron build with 192GB of DDR3 ECC that housed the majority of my VMs (NAS, media management, IRC server, Discord bots, etc.). The second server was a Dell R710 (a former Google GSA) that handled some game servers and niche applications. These were haphazardly sitting on shelves in my basement, along with a 48-port Cisco switch. All together, under normal use this "stack" was pulling 700 watts constantly.
I decided to leverage my large collection of older machines, a few select "new" used parts from eBay, and one brand new piece to build a proper rack in a dedicated room in the back of the house. The new stack:

- Server #1: a Ryzen 3900X with a matched 128GB QVL'd DDR4 kit, 4TB of NVMe, and 2x 20TB refurb Exos in RAID 1. The GPU is an old OEM Dell GTX 750, just to have a display out. This machine takes over "main server" duty and runs the majority of my VMs, including the primary NAS.
- Server #2: an i5-9600T with 64GB DDR4, 1TB NVMe, and 3x 12TB HGSTs pulled from the old server (2x 12TB in RAID 0, 1x 12TB for NVR). This server functions as my backup NAS and NVR server.
- Server #3: a Celeron J1900 with 16GB DDR4 that lives just to run my Discord bots.
- Networking: an HPE 48-port PoE switch and a MikroTik router.
The new stack only pulls about 250 watts from the wall under normal use, with occasional small spikes up to 400 watts during heavy transcoding on my media server VM.
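For anyone curious what that power drop actually saves, here's a rough back-of-the-envelope calculation. The $0.15/kWh rate is just a placeholder assumption; plug in your own rate:

```python
# Rough annual electricity cost comparison, old stack vs. new stack.
# RATE_PER_KWH is an assumed placeholder -- substitute your actual rate.
RATE_PER_KWH = 0.15
HOURS_PER_YEAR = 24 * 365

def annual_cost(watts: float) -> float:
    """Annual cost in dollars for a constant draw of `watts`."""
    return watts / 1000 * HOURS_PER_YEAR * RATE_PER_KWH

old_stack = annual_cost(700)  # ~$920/yr
new_stack = annual_cost(250)  # ~$329/yr
print(f"Old stack: ${old_stack:,.0f}/yr")
print(f"New stack: ${new_stack:,.0f}/yr")
print(f"Savings:   ${old_stack - new_stack:,.0f}/yr")
```

At that assumed rate, the roughly 450-watt reduction works out to somewhere around $590 a year.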
If I were to win the R9700, it would go into the main server so I could offload my "local AI" workloads from my main PC to the server, as well as run some bigger models than the 7900X/RX 6800 combo in my main rig can handle.
