I have ascended to Linux. Now what?

I finally installed Linux. After many years of frustration with Windows, embracing the ideology of open source software, and being absolutely blown away by my Steam Deck, I decided to move my main Windows 11 PC into a dual-boot configuration with Kubuntu 24.04. I’ll have to continue using Windows for specific games with anti-cheat like Valorant, but I want the PC to spend 90% of its runtime in Kubuntu. I’ve been really impressed with the ease of use, and with how quickly I have found solutions for the software I need so far. Kubuntu has been nice. I even figured out how to set up Secure Boot, which is necessary for Riot’s anti-cheat, without asking the internet.

However, there is one thing I’m unsure about, and that’s backups. Under Windows I have been using Paragon’s Backup & Recovery 17 CE for scheduled full and differential backups of my boot drive. Paragon’s software can actually read and back up Linux filesystems, so I’m technically covered until I figure out a better solution. Ideally, I would like to find a Linux-native app that does the same thing, since I don’t want to boot into Windows for anything other than playing games. I have both OS installations on the same drive, and I want the Linux backup application to back up all the partitions on this drive.

I’d like to hear experiences from knowledgeable Linux users out there. What’s the right solution for me in your opinion? Thank you in advance for the help.


This is tricky to recommend as I myself use a different approach, but you might consider exploring Timeshift, which is more like your traditional ‘program that does backups’.

I like borg/borgmatic, but maybe you don’t want to start there.

Whatever you use, especially while you’re still unfamiliar with it, test restoring from the backup.


Timeshift is a good tool, but do understand that it isn’t a backup utility, strictly speaking. Rather, it saves a system snapshot that lets you revert to a previous configuration should an update break the system. Backups of files should always follow the 3-2-1 rule: three copies of your data, on two different media, with one copy off-site.


In true Linux fashion, the answer is already on your system: rsync. Provided your backup target runs Linux too, rsync by default copies only the files that have changed since the last run, so the initial run takes the longest (it copies everything) and subsequent runs finish quickly. Rsync is a command-line tool, and its options are incredibly powerful and versatile, although there may be GUI front-ends for it. Search Synaptic (the GUI for the package manager) for a solution.


Glad it’s going so smoothly :slight_smile:

That seems like a big ask: backing up multiple OSes and all your data and expecting a restore to work 100% when needed. It’s doable for sure, but at work, if I were relying on such a comprehensive backup, I would be doing regular restore testing to ensure we can get back what we need and that everything runs.

Do you know you can re-download your games? Where is your game data, e.g. on some cloud server somewhere?

I was looking at Duplicati the other day as an all-in-one solution for OS and data backup. It’s also cross-platform: https://duplicati.com/

I haven’t got round to reading about it in detail, or trying it, though.

Perhaps split things up a bit? That gives you more chance of a successful restore, and even if part of the restore fails, you will likely get most of the pieces back.

If you are just starting out with Linux, I might suggest you keep your data separate from the OS: at a minimum, /home on a separate partition, but maybe on a NAS or external drive. That way, if you kill the OS whilst learning, your data should be safer during a reinstall, and you avoid having to restore from backup.

I try to keep all my personal data fairly consolidated (Dropbox and a Synology NAS) so it’s easy to back up, and I’m not bothered about backing up the OS, as I have a clear set of working notes if I need to rebuild. And it’s personal data, which doesn’t change that much, so in the worst case of having to restore from the monthly off-site drives, I wouldn’t lose much.

Roughly:

Dropbox: for all our docs and pdfs (I take a shared responsibility approach to backups here where Dropbox is responsible for resilience and backups https://assets.dropbox.com/www/en-us/business/solutions/solutions/dfb_security_whitepaper.pdf)

Synology: music, code repos (which are regularly getting pushed to GitHub anyway) and photos. Backup of all of this using Synology Hyperbackup app to Synology C2 cloud nightly. This is the data I really care about.

Once a month I back up, again with Synology Hyper Backup, everything in Dropbox and on the Synology to one of two hard drives that are kept off-site: one drive for even months, one for odd months. This backup is a full copy rather than an incremental, which does have some risks. I try to mitigate those risks with the odd/even month drive approach, but with a full copy of the Synology and Dropbox I can restore any file easily with a copy, and can quickly check that the backup worked.

Whatever you decide on:

  • test a restore regularly
  • 3-2-1 approach is a good guide

I suggest Veeam:

There are many backup options on Linux, but restoring from a backup onto bare metal is a rather advanced task. That is why you’ll often find suggestions to make an occasional Clonezilla image in addition to daily backups with another tool (tar / rsync / borg / etc.). Veeam is helpful for the beginner because it can generate bootable media for you, which can be used to restore your whole system from bare metal directly to your most recent backup.

I use duplicity to back up all my servers to Backblaze. I have a full home lab now, and I can’t recommend Backblaze: by default the volumes are not set to delete old data, and they have some other bad patterns that rack up charges you don’t notice for months.

So I’m going to point duplicity at my home lab instead, which will also make it a lot easier to test disaster recovery.

All the storage drives in my NAS use ZFS. I pop in spare drives and use ZFS snapshots to handle backups. My main box is still on Btrfs, and its snapshotting system is… meh.

I hope to switch my main box to zfsbootmenu and then I can use ZFS snapshots everywhere.
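For anyone curious what the snapshot-to-spare-drive workflow looks like, it is roughly this (a sketch only; `tank/data` and `backup` are hypothetical pool/dataset names, with `backup` being a pool on the spare drive):

```shell
# Create a read-only, point-in-time snapshot of the dataset (instant and cheap)
zfs snapshot tank/data@2024-06-01

# Replicate the whole snapshot to the pool on the spare drive
zfs send tank/data@2024-06-01 | zfs receive backup/data

# Later snapshots can be sent incrementally (-i): only the delta is copied
zfs snapshot tank/data@2024-07-01
zfs send -i tank/data@2024-06-01 tank/data@2024-07-01 | zfs receive backup/data
```

The incremental send is what makes this practical for routine backups: after the first full replication, each run transfers only the blocks that changed between snapshots.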


I like rclone for managing access and syncing data to a number of internet backup services, and rsync for devices on the home network. These can be scheduled with a script or command listed in the crontab, or run from a systemd .timer unit.
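For the systemd route, the pair of units looks roughly like this (a config sketch; the script path and unit names are hypothetical):

```ini
# /etc/systemd/system/backup.service
[Unit]
Description=Nightly rsync/rclone backup

[Service]
Type=oneshot
ExecStart=/usr/local/bin/run-backup.sh

# /etc/systemd/system/backup.timer
[Unit]
Description=Run the backup every night

[Timer]
OnCalendar=*-*-* 02:30:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now backup.timer`. `Persistent=true` runs a missed job at the next boot, which is handy for a desktop that isn’t powered on around the clock.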

K3n.


I use Veeam (Community Edition, free for up to 10 nodes), but I run it on a dedicated VM in my home lab to reduce any rebuild issues, with the data stored on a QNAP NAS. On my Linux Mint boxes I also use Timeshift for daily snapshots. If I can recover using Timeshift, great; if not, I can restore the entire VM from a weekly full backup. I keep 6 weeks of backups on hand.

Clonezilla is an excellent point-in-time imaging tool, but it’s only as good as the last backup. Something that backs up your docs and other changing data daily is recommended. rsync is good, and there are a lot of NAS devices out there with utilities to support backups.

I have a 2TB icedrive space in the Cloud to backup anything I cannot live without.

Comes down to budget and how technical you are. YMMV.


I’ve heard of rsync, but didn’t know what it was. I’ll look into this. Thanks!

That’s not the product I was suggesting at all…

You can backup to any NFS or Samba server.

It gives you bootable recovery media for bare-metal recovery.

I can’t swear it will handle all dual-boot systems properly (I’m not going to wipe some systems just to test it), but it should, as it backs up the whole system.


I see. I have Windows and Linux VMs, so the full-blown product on Windows works better for me.

I also use Proxmox backups for short-term projects. Those go to a Raspberry Pi 4 with an external 2 TB SATA drive connected over USB on the other side of my house.

Watched this video this morning. I’ve not had any hands-on time yet, but the demo looks comprehensive for backing up individual Linux folders / files. Nice feature set, and GUI-driven.

Pika Backup. (Uses Borg)


There are many ways to back up your data.
My preferred method is to back up my files to separate drives via a docking station.
The benefit of saving to working HDDs is that they are cheap and can store many images.
And they can be encrypted.
I do not, and never will, store anything in the cloud.


“There is no ‘Cloud.’ It’s just someone else’s computer.”


Linux is good when you do not do much with your PC; opening Firefox on Linux or Windows is the same. As for backups, a bootable ISO on a USB stick with something like Acronis does fine.

Exactly: data they also have access to, as does anyone they sell the info to.

Like others have said, Acronis on a USB stick is a great backup tool, and Clonezilla can do the same.
I do major backups this way, usually once or twice per lifetime of the install.
I use Timeshift if I need to do something experimental.
All this is written to a non-encrypted backup partition or an external drive.

Thanks to everyone who replied. I think I’m going to use Pika Backup to get my important files covered. It’s good to know there are so many options for backups. I am, however, still looking for an open-source option for differential system-image backups. I might try UrBackup, but I don’t know if it will work on my system as it is now, since it’s a dual-boot Windows/Kubuntu setup on one disk. Maybe I should consider moving the two OS installations to separate disks; that may make system-image backups of the two OSes easier.

Learning to make peace with the fact that you are now going to have to dedicate hours of your time to reading forum threads and watching videos in an attempt to solve issues with things that just effortlessly worked on Windows or Mac :rofl:.

It’s getting easier, though. Slowly.
