This is my first post here, and this feels like the best community to ask the question in the title. Reading through threads over the last couple of weeks, you all seem so knowledgeable and willing to help one another.
I have 3.5GB of important files I want to keep long term, some of them sensitive documents with personal details.
I have multiple copies locally across 6 drives, some SSDs and some spinning rust.
I want a cloud backup for (1) ease of remote access and (2) a proper offsite copy in case my house burns down or something; you never know.
What is the best way to encrypt files to be stored on Google Drive/OneDrive?
I know you can make password-protected zip/rar archives, but how secure are they really?
Check out Cryptomator, maybe it’s a good option for you. It will encrypt a folder within your cloud on the fly automatically. Desktop clients are free, and they have paid mobile clients for Android and iOS if you feel like supporting their work (but you can probably build your own from sources for mobile if you have the know-how).
It’s open source and, as far as I know, has a pretty good track record.
Lately I have been using gpg --encrypt. If you use YubiKeys with your GPG private keys loaded, you can encrypt files so that only your YubiKey can unlock them.
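A minimal sketch of that workflow, assuming the key pair already exists and you@example.com stands in for your actual key ID:

```
# encrypt to your own public key; only the matching private key
# (which can live on the YubiKey) can decrypt the result
gpg --encrypt --recipient you@example.com documents.tar

# later, restore it (gpg will ask the YubiKey to do the decryption)
gpg --decrypt --output documents.tar documents.tar.gpg
```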
WinRAR, 7-Zip, VeraCrypt, gpg… there’s no point in overcomplicating your life.
The question is whether you’ll be working with a small number of large files or a large number of small files, how often the archive’s contents will change, and what you need in terms of emergency extraction of data from the archive.
If it’s one large file, e.g. a full-disk backup image, I’d use VeraCrypt.
If it’s a large number of small files with a fixed count and structure, I would also choose VeraCrypt (a basic container workflow is sketched after this list).
If they are files of varying sizes in a dynamic, frequently changing layout, where you need to extract specific data rather than everything, I would choose rar/zip.
If you want deep per-file separation, I would choose pgp/gpg.
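For the VeraCrypt route, a minimal container workflow from the Linux command line might look like the sketch below; the paths and size are made-up examples, and option syntax can vary between versions (check veracrypt --text --help).

```
# create an encrypted container; in --text mode VeraCrypt prompts
# interactively for anything not given on the command line
veracrypt --text --create ~/backup.hc --size 4G

# mount it, copy files in, then dismount before uploading the container
veracrypt --text ~/backup.hc /mnt/vault
cp -r ~/important-files/* /mnt/vault/
veracrypt -d ~/backup.hc
```

The nice property here is that the cloud provider only ever sees one opaque .hc file.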
Instead of thinking about what to choose for encryption/archiving, think from the other side: what fits your data recovery model in an emergency?
How fast can you get to the data you need? If you only need to extract 100MB, will you have to download a 30GB archive to do it?
Choose the model that best suits your data recovery method.
Avoid large archives if you need to access small files. Use large archives only for images of entire disks/partitions.
Segregate your data into different subdirectories and archive them separately, so that at the cloud level you can select only the data you really want to recover.
In my opinion, it would be good to change the operating model.
Sort the data into several layers: one large archive for the entire disk/partition, and the rest sorted by size and by how often you’re likely to need to pull it back (a sketch of per-folder archiving follows below).
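As a concrete sketch of that kind of segregation, assuming a 7-Zip-style workflow (the folder names are just examples):

```
# archive each top-level folder separately so you can later download
# only the archive you actually need; -p prompts for a password and
# -mhe=on encrypts the file names as well as the contents
for d in documents photos tax-records; do
  7z a -p -mhe=on "backup-$d.7z" "$d/"
done
```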
The model of operation will look different for copies kept locally versus copies that have to be sent over the Internet and downloaded again when needed.
So what good is a 1-to-1, fully up-to-date backup if it’s 200GB and you only have 2Mb/s… time, time, time…
What if you suddenly deleted a 500KB PDF and had to download a 50GB archive to recover it from your online copy… time, time, time…
Yeah, you’re right, I think I was overcomplicating things.
I’m only looking at around 3GB of data, which I’ll add to maybe once a month, so a 7zip archive with a strong password should suffice for now.
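Something like this, if I’ve read the docs right (the names are just placeholders):

```
# AES-256 encrypted 7z archive; -p prompts for the password and
# -mhe=on hides the file names too, not just the contents
7z a -p -mhe=on important-files.7z ~/important-files/
```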
I also have limited upload bandwidth, so no full-disk backups at the moment, but that’s definitely something I’d like in the future. When it comes to that, I’ll use VeraCrypt.
For your needs, consider using CloudMounter. It allows you to mount Google Drive or OneDrive as local drives and encrypt your files seamlessly before uploading them. The encryption happens locally, so even if someone gains access to your cloud storage, they won’t be able to read the data without your encryption key. It’s an efficient way to combine security and convenience for cloud backups.