With the rise of virtualisation at home comes an inflation of server installs. I am now facing the problem of keeping several machines updated, and I was wondering whether any of you have a more elegant solution than a shell script that sshes into each machine and runs apt-get update && apt-get -y upgrade && apt-get -y dist-upgrade.
You can use unattended-upgrades (though it mainly covers security updates rather than regular package upgrades).
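If it helps, this is roughly how it's enabled on a Debian/Ubuntu box (the dpkg-reconfigure step writes the periodic-run config for you):

```
# Rough sketch: turn on unattended upgrades on a Debian/Ubuntu machine.
sudo apt-get install -y unattended-upgrades
# Prompts yes/no and writes /etc/apt/apt.conf.d/20auto-upgrades
# to enable the daily unattended run.
sudo dpkg-reconfigure -plow unattended-upgrades
```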
Puppet may be what you are looking for.
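Not a full setup, but just to show the shape of it, a one-resource manifest applied locally looks something like this (assumes Puppet is installed; the package name is only an example):

```
# Minimal sketch: apply a tiny Puppet manifest that keeps one package current.
cat > /tmp/keep-current.pp <<'EOF'
package { 'openssh-server':
  ensure => latest,
}
EOF
sudo puppet apply /tmp/keep-current.pp
```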
Or you could do it manually: use shared SSH keys so one machine can sign into all of them, then use a bash script to send the commands over and execute them, much like I did in this bash script: https://teksyndicate.com/comment/1761001.
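Something along these lines, assuming key-based auth is already set up and the hostnames/user here are placeholders:

```
#!/usr/bin/env bash
# Sketch: run the usual apt update/upgrade on each host over SSH.
set -euo pipefail

hosts=(web01 web02 db01)   # placeholder hostnames

for h in "${hosts[@]}"; do
    echo "=== ${h} ==="
    ssh "admin@${h}" \
        'sudo apt-get update && sudo apt-get -y upgrade && sudo apt-get -y dist-upgrade'
done
```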
It depends on what you are updating, to be honest. If this is production stuff, I wouldn't recommend just updating everything willy-nilly. Things can break with updates (not often, but when it breaks it breaks, and you poop your pants finding out stuff's broken).
If all of your servers are similar, though, I'd say a script that updates all of them at once isn't bad. That way you can test one, see that everything is good, and then run the script so all the servers update. Which it sounds like you already have.
This isn't so much about managing updates, but it is handy if you have a bunch of servers running the same OS: check out apt-cacher-ng. It's a caching proxy for apt that makes updating a bunch of systems quicker by not downloading the same updates over and over.
Haha, yeah. What I like about the ng version is that it acts more like a proxy, so you can keep your sources lists and just point apt at the apt-cacher server as a proxy. You don't have to configure apt-cacher with a set of sources like you used to, which I always ran into problems with.
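For anyone else setting this up, pointing the clients at the proxy is about one line; 3142 is apt-cacher-ng's default port, and the hostname and conf filename here are just placeholders:

```
# Sketch: tell apt on each client to fetch packages through the apt-cacher-ng box.
echo 'Acquire::http::Proxy "http://apt-cacher.lan:3142";' | \
    sudo tee /etc/apt/apt.conf.d/01proxy
```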