Managing 2 consumer-grade PCs

New here, so I’m not sure if I have the right topic/tags, but here goes.

I would like to preface what I’m about to say with this: I’m a Data Science PhD student, and my grant manager has already purchased (or allocated funds to purchase) this hardware. I had/have no say, and I’m grateful to even have these resources.

I have 2 computers coming my way, both identical:
Intel Core i9-13900K
128 GB DDR5
ASRock Taichi
2× Gigabyte RTX 4090s
2 Samsung 980 Pro NVMe drives (or 990 Pros, if they are actually released this month)
ICY DOCK hot-swap tray in the 5.25" bay for additional SATA drives
EVGA SuperNOVA 1600 T2 PSU

I was wondering if anyone has recommendations for managing these computers so that they maintain the same versions of Python/MATLAB/C/CUDA and associated packages. At this point I’m quite familiar with the SLURM workload manager, and I’m wondering whether something similar exists for consumer hardware, and/or looking for recommendations on distributing workloads evenly between the two machines.

Really just any tips/thoughts/recommendations would be greatly appreciated 🙂

Ansible?
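A minimal sketch of the idea, assuming Ubuntu on both boxes - the inventory hostnames and version pins below are made up, not recommendations:

```yaml
# inventory.ini (hypothetical):
#   [workstations]
#   box1 ansible_host=192.168.1.10
#   box2 ansible_host=192.168.1.11
---
- name: Keep both workstations on identical tool versions
  hosts: workstations
  become: true
  tasks:
    - name: Install system packages
      ansible.builtin.apt:
        name:
          - python3
          - python3-pip
          - build-essential
        state: present
        update_cache: true

    - name: Install pinned Python packages
      ansible.builtin.pip:
        name:
          - numpy==1.26.4   # example pins only
          - torch==2.1.2
      become: false
```

Run it against both hosts (`ansible-playbook -i inventory.ini playbook.yml`) whenever you change a pin, and the machines stay in lockstep.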

These days almost everyone uses Docker (and/or docker-compose) to manage their “portable”/“repeatable” environment - it pushes you to write your container build scripts so that they use nothing from the host.
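As a rough sketch, a Dockerfile that pins everything might look like this (the base-image tag and package versions are only examples):

```dockerfile
# Pin the CUDA base image so both machines build the exact same environment.
FROM nvidia/cuda:12.2.0-cudnn8-runtime-ubuntu22.04

# Everything comes from the image, nothing from the host.
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# requirements.txt holds exact pins (numpy==1.26.4, etc.).
COPY requirements.txt .
RUN pip3 install --no-cache-dir -r requirements.txt

WORKDIR /workspace
```

Build the image once and push it to a registry (or build it identically on both boxes), then run it with `docker run --gpus all ...` once the NVIDIA Container Toolkit is installed on the hosts.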

For multiple machines, most homelabbers stick to Proxmox VMs, or Rancher k3s with NFS (not even using the NFS provisioner - just an NFS host mount on each node with k3s pointed at it; see the sketch below).
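A sketch of that “NFS host mount” variant - the server name and paths are made up. Mount the share on every node (e.g. an /etc/fstab entry like `nas.local:/export/data /mnt/shared nfs defaults 0 0`), then point pods at it with a plain hostPath volume:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: train-job
spec:
  containers:
    - name: train
      image: my-training-image:latest   # hypothetical image
      volumeMounts:
        - name: shared-data
          mountPath: /data
  volumes:
    - name: shared-data
      hostPath:
        path: /mnt/shared   # the NFS mount on the node, not an NFS PV
        type: Directory
```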

For keeping the hosts themselves in sync, you generally want them up to date with the latest versions of everything anyway - depending on your distro, whatever auto-update mechanism you get should do the job.
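On Ubuntu/Debian, for instance, that can be as simple as enabling unattended-upgrades:

```bash
sudo apt-get install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades   # turns on the periodic upgrade job
```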
