I’ve got a workflow that’s a bit annoying and wondered if there were better methods.
I have a ‘hand built’ website that’s doing me very well, no updating of plugins, etc., cheap to host, responsive and I’ve got complete control (so not SquareSpace).
I use NotePad++ to edit pages live and then I periodically use FileZilla to backup files. If I make a lot of changes, I tend to download the whole site (it’s not huge), but if I change one page, I just download that one page.
I just wonder, is there a better/easier way of backing up the site? It would be ideal if FileZilla had a feature that searches for modified files, but that’s not an option!
OH HELLS NO GIT IS NOT A BACKUP! You lose the repo, you are SOL.
If you do need version control and decide to use Git, do make sure you have a pretty good grasp of it and, at the very least, push to a remote repo. Although even that may be vulnerable to your computer getting compromised and wiping both repos. THAT IS NOT A BACKUP.
Depending on how important this data is, I would have an independent system PULLING data from your server into a versioned backup and, if possible, never coming into contact with your PC. You can approach this in many many ways, rsnapshot, NAS, scripts that just zip it up, create git bundles that you save… And have that system backed up to a USB drive every now and then.
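The "git bundles that you save" idea from above can be sketched in a few commands. This is a minimal demo run in a throwaway directory (the paths and names are made up for illustration); in practice you'd run `git bundle create` inside your website repo and store the resulting file on the NAS or USB drive:

```shell
set -e
cd "$(mktemp -d)"
git init -q -b main site
cd site
echo "hello" > index.html
git add index.html
git -c user.email=you@example.com -c user.name=demo commit -q -m "add index"
# One file containing the entire repo, history included:
git bundle create ../site-backup.bundle --all HEAD
# Restoring later is just a clone from the bundle file:
git clone -q ../site-backup.bundle ../restored
```

A bundle is a single ordinary file, so it travels well: any backup tool that can copy a file can version it.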
Git has taken over the world. I started with SVN many, many years ago, but at this point it only hurts you to not know Git IMHO. I haven’t seen SVN in production environments in a decade at this point, but I’m sure it’s still out there. Shout-outs to my TortoiseSVN nerds!
2] File Backup
Back up your version control, regularly. My general rule of thumb is 3-2-1 (three copies, on two different media, one off-site) for anything that would be time-expensive AND required to rebuild in the event of a total loss. Backing up source control preserves the history of bugs/changes/etc., so it’s more than just preserving your website.
The way I go about it is to set up a private repository on my GitLab account. For managing the repositories I use GitKraken, but that is purely a personal preference.
And you can also use git to “deploy” code on the production server. You set it up so that your production machine only has access to git pull. That way it also doubles as protection against defacing.
I did look into Git before posting, figured out that Git is a PC-installed app and GitHub is the cloud version that MS uses to train their AI…I think!
Ooooh, I’ll look into that cheers!
Cheers Vivante, if possible I’d store the local Git on my TrueNAS server and set up snapshots, perhaps that’s a good way of keeping extra version control, on top of the Git’s own version control!?
Nice one, thank you again. Like you say, if data is important, multiple copies should exist. I’ve got what I think is a reasonable backup system as my work files have high value and I can’t afford to lose them (Architectural). I have:
A daily server that’s backed up to:
- an every-other-day server,
- another server weekly,
- another server fortnightly,
- another server monthly,
- and also to Backblaze.
Obviously this is not exactly controlled by a script, I just turn them on when enough work has been done!
Also I use Syncthing, so if it’s most convenient I might put the Git repo in a new Syncthing folder on the PC, which is synced with the daily server, which then syncs to another 2 machines. In turn the Syncthing data has snapshots on the TrueNAS server, so hopefully that’ll be sufficient.
I’m by no means an expert when it comes to Git, but I think I remember some advice against using network shares directly. Something about filesystem events and/or atomic operations, maybe, IDK?
You can probably init a bare repo on the network share and a normal one locally, then link them together. Or you could spin up a full Git server on your TrueNAS.
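Linking the two is just a filesystem-path remote. A minimal sketch, with a local directory standing in for the mounted share (any `/mnt/...` path of yours would work the same):

```shell
set -e
cd "$(mktemp -d)"
# "nas-share" stands in for a mounted network share like /mnt/truenas/repos:
git init -q --bare nas-share/site.git
# Your normal local working repo:
git init -q -b main site && cd site
echo "hi" > index.html
git add index.html
git -c user.email=you@example.com -c user.name=demo commit -q -m "first"
# Link them: the bare repo on the share becomes a plain-path remote.
git remote add nas ../nas-share/site.git
git push -q nas main
```

Only the local repo is touched by day-to-day Git operations; the share just receives pushes, which sidesteps most of the network-filesystem worries mentioned above.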
Thank you for that warning, very much noted! Perhaps I can use SyncThing safely though?
I’ll have a look at Git’ing my TrueNAS server…I’m only using Core though, so it might not work very well as it’s slowly being discarded from a plugin aspect.
I don’t really see the need for Syncthing, Git has built-in utilities for “sync” and then some.
Check out the quick guide, it’s very useful if you have no idea where to start: https://git-scm.com/docs/giteveryday
If you are REALLY into it, they have a book freely available:
Ah, I was only thinking that if the revisions were located on my PC, if the PC goes bang, I’d lose the revisions, so thought making it a SyncThing folder might be wise?
Thanks for that, I must admit so far I’m not finding it intuitive at all, but it’s very new to me, with only previous experience of NotePad++ and the usual FTP tools.
That’s good to hear, it might be a little OTT, but better to be that than the other way and just hope stuff keeps working!
There are many interesting suggestions. I’m gonna add stuff on your mule’s back :o)
Git doesn’t require any server for hosting. No need for GitHub. There are alternatives to GitHub, like GitLab, Gitea, etc. But if you’re OK with raw-dogging Git and have no interest in specific CI/CD features, hosting your Git through SSH is enough.
Your NAS should provide some kind of SSH.
On the NAS, initialize a bare empty repository (paths are to adapt):
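For example, with a path chosen to match the remote URL used below (run over SSH on the NAS; the directory name is yours to adapt):

```shell
# On the NAS: create a bare repository (no working tree, just Git data).
mkdir -p ~/repositories
git init --bare ~/repositories/my-awesome-website.git
```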
You would configure your local repository by adding a remote:
git remote add local-nas dev-on-nas-user@local-nas-name-or-ip:repositories/my-awesome-website.git
git push local-nas main
For syncing, you could check out rsync. It’s a very versatile tool, useful for synchronizing files. Combined with git archive, it’s a must.
For external backups, I’m using European cloud-based block storage via rclone. The account I configured for pushing the backups is a write-only agent, which is great for preventing backup destruction from a compromised account.
A last piece of advice: if you’re using Git to manage your code, you risk pushing the .git directory into a published directory. It’s a good idea to ensure that your favourite web server disallows access to any path containing .git.
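For example, assuming nginx is the web server in question, a rule along these lines blocks the whole tree:

```nginx
# Refuse any request whose path contains ".git" (config, refs, objects...).
location ~ /\.git {
    return 404;   # pretend it doesn't exist, rather than revealing a 403
}
```

Apache and other servers have equivalent deny rules; the point is just that `/.git/config` and friends must never be fetchable.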
I think most people have already said everything important. I just want to stress that you can absolutely have just a local Git repository and be done in terms of version control. If you then back the repository up, you’ll be fine.
You can gain more from git by also using remote repositories, but not as a backup, rather an additional way of version control. You’d still want to make regular backups, but using a remote repository would help in syncing things across multiple devices or - if you chose so - people.
GitHub and GitLab (and others like BitBucket and Gitea) are mostly management systems “on top” of git, that help you manage git in a visual manner - rather than CLI (command line interface).
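For someone brand new to Git, the local-only workflow really is this small. A throwaway demo (file names invented for illustration) showing two "snapshots" with no server anywhere:

```shell
set -e
cd "$(mktemp -d)"
mkdir mysite && cd mysite
echo "<h1>Home</h1>" > index.html
git init -q -b main                  # turn the folder into a repository
git add -A
git -c user.email=you@example.com -c user.name=demo commit -q -m "first snapshot"
echo "<h1>Home, reworded</h1>" > index.html
git add -A
git -c user.email=you@example.com -c user.name=demo commit -q -m "reword heading"
git log --oneline                    # lists both snapshots, newest first
```

Backing this up is then just a matter of copying the whole folder (the `.git` subdirectory holds all the history).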
My mule’s back is strong at times, and weak at other times
Thank you very much for all of this, I think this is probably something I should do on TrueNAS Scale when I’m eventually forced to use it. Core just isn’t cared about anymore. I do see GitLab on there, but not Git - and even then, those are community-based plugins.
Thank you very much for the instruction though, I’m going to note them for when the time comes! I do have a spare machine with Scale on, but I just use it periodically at the moment.
Rsync is a helpful tool indeed…and I’m glad I know at least one thing about what you’re talking about!
Thanks for the heads-up about accidentally publishing, noted!