I am curious how you guys manage your files in a convenient and secure way.
I also use Syncthing; some of my files are synced to all devices, including my laptop, which gets backed up.
I would prefer to have a NAS with redundant hard disks, so I wouldn't have to worry about the integrity of my backups, and I could archive some files I still care about but don't need on my laptop.
But that would be expensive, and I would then need to back up the archived files anyway, since RAID is not a backup: I couldn't leave anything important only on the NAS and nowhere else (although some of the stuff isn't that important).
rsync
Incremental and obscure, but no encryption and no compression, because I've had restores fail in the past due to compression.
I use LUKS encryption, BTRFS file system, and rsync. I run a script that prompts for the drive's password, then automates unlocking, backing up, and locking: https://gist.github.com/theandrewbailey/3b5bbb3fa3a53c2ef0e1...
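I haven't read the linked gist, but the unlock, back up, lock flow it describes can be sketched roughly like this (device, mapper, and mount-point names below are hypothetical placeholders, not the author's actual values):

```python
import subprocess

# Hypothetical names: adjust to your own device and mount point.
DEVICE = "/dev/sdb1"
MAPPER = "backupdrive"
MOUNT = "/mnt/backup"
SOURCE = "/home/"

def backup_commands(device, mapper, mount, source):
    """Return the command sequence: unlock, mount, rsync, unmount, lock."""
    return [
        ["cryptsetup", "open", device, mapper],      # prompts for the LUKS passphrase
        ["mount", f"/dev/mapper/{mapper}", mount],
        ["rsync", "-a", "--delete", source, mount],  # mirror the source onto the drive
        ["umount", mount],
        ["cryptsetup", "close", mapper],             # lock the drive again
    ]

def run_backup():
    """Run the whole sequence; any failure stops the rest (check=True)."""
    for cmd in backup_commands(DEVICE, MAPPER, MOUNT, SOURCE):
        subprocess.run(cmd, check=True)
```

`cryptsetup open` does its own passphrase prompt on the terminal, so the script never has to handle the password itself.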
You can easily specify a password file via an env variable [1]. Make sure the file's permissions are restricted to root only. If someone had root, they could read any file already, so I don't see a threat beyond someone gaining root.
My offsite is done automatically with cron using restic. Then I use rsync for incremental backups to a USB drive (LUKS encrypted). Beyond that, I have a NAS with mirrored ZFS (+ daily snapshots) for live data. I use Syncthing to get my data to the NAS from phones, laptops, etc.
[1] https://restic.readthedocs.io/en/latest/faq.html#how-can-i-s...
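A small wrapper for that unattended-restic setup might look like the sketch below. It checks the permission point made above before handing the password file to restic via `RESTIC_PASSWORD_FILE` (a real restic env variable, as is `RESTIC_REPOSITORY`); the paths and repository string are made-up examples:

```python
import os
import stat
import subprocess

def check_password_file(path):
    """Refuse to use a password file that group/other can access."""
    mode = os.stat(path).st_mode
    if mode & (stat.S_IRWXG | stat.S_IRWXO):
        raise PermissionError(f"{path} must be accessible by its owner only (chmod 600)")
    return path

def restic_env(password_file, repo):
    """Environment for a non-interactive restic run."""
    env = dict(os.environ)
    env["RESTIC_PASSWORD_FILE"] = check_password_file(password_file)
    env["RESTIC_REPOSITORY"] = repo
    return env

def run_backup(password_file, repo, paths):
    # e.g. invoked from a root cron job
    subprocess.run(["restic", "backup", *paths],
                   env=restic_env(password_file, repo), check=True)
```

Failing loudly on loose permissions means a misconfigured password file aborts the cron job instead of silently exposing the repository secret.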
This is probably not a sustainable backup strategy, but you'd be amazed how many people abuse their access and how much power you can rapidly accumulate with what one supervisor described as "CIA level opsec".
(Just be careful -- if you get hit over the head and forget your master password, you're gonna have a bad time!!)
I kind of hate using Duplicati: the interface is incredibly clunky, and restoring data is a massive chore. I recently had to rebuild the media server after a system disk failure, and there's no way to do a data consistency check against the remote backup, since all check data is stored in a local database that had to be rebuilt.
I also wrote (and use) a rather simple Python script that tries to create archives efficiently.
I store them on two arrays in different machines locally, but I also encrypt and upload them offsite, to a dedicated server I use for other stuff.
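The commenter's script isn't shown, but one simple way such an archiver could be "efficient" is to only archive files modified since the last run. A minimal sketch of that idea (the state-file format and function names are my own invention, not the author's):

```python
import json
import os
import tarfile
import time

def make_incremental_archive(src_dir, out_path, state_path):
    """Archive only files modified since the last run (a crude incremental)."""
    try:
        with open(state_path) as f:
            last_run = json.load(f)["last_run"]
    except FileNotFoundError:
        last_run = 0.0  # first run: archive everything
    added = []
    with tarfile.open(out_path, "w:gz") as tar:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                path = os.path.join(root, name)
                if os.path.getmtime(path) > last_run:
                    tar.add(path)
                    added.append(path)
    with open(state_path, "w") as f:
        json.dump({"last_run": time.time()}, f)
    return added
```

A real tool would track per-file hashes rather than one timestamp, which is roughly what borg and restic do with their chunk indexes.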
Syncthing for some things, rsync for others; going to test Borg.
I also have a 2.5" 4TB HDD, where I back up important data again.
I don't encrypt any backups.
I am more afraid of losing data irrecoverably than of someone having local access to it.
It has every feature I could possibly want, in a well-engineered piece of software, at a price I could not refuse.
Re backup: try to follow the 3-2-1 rule (at least 3 copies, 2 different media, 1 off-site),
using one of the following tools, depending heavily on the use case (the data/system to back up):
* rsync for snapshots over ssh or nfs
* mt & tar for good old tape streamers
* vorta & borg for workstations
* duplicity for secure backup to insecure networked devices
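The "rsync for snapshots" item above usually means hard-link snapshots (rsync's `--link-dest`): unchanged files are shared between snapshots, so each one looks complete but costs almost no extra space. The core mechanism, sketched in Python with hypothetical paths:

```python
import filecmp
import os
import shutil

def snapshot(src, dest, link_dest=None):
    """Copy src into dest, hard-linking files unchanged relative to the
    previous snapshot in link_dest (the idea behind rsync --link-dest)."""
    os.makedirs(dest, exist_ok=True)
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        os.makedirs(os.path.join(dest, rel), exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(dest, rel, name)
            prev = os.path.join(link_dest, rel, name) if link_dest else None
            if prev and os.path.exists(prev) and filecmp.cmp(s, prev, shallow=False):
                os.link(prev, d)    # unchanged: share storage with the old snapshot
            else:
                shutil.copy2(s, d)  # new or modified: store a fresh copy
```

The equivalent rsync invocation would be along the lines of `rsync -a --link-dest=../snap.1 src/ snap.0/`, with old snapshots rotated by renaming directories.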