I'd like to be more organized about this stuff, but it seems like a lot of setups I see online are for multi thousand dollar hardware, trying to replace the cloud entirely, torrenting and plex, self hosted stuff that needs regular maintenance, etc.
I think I want "Cold storage" files to live on the NAS, and get backed up from there to a 4TB external drive, and everything else to get backed up to the NAS.
Beyond that, I'm not quite sure.
I'd like automatic sync to Google Drive, since that seems to actually be a thing for Synology even though it seems to be a hassle on Linux.
I suppose I could set up the NAS to sync with SyncThing, and then from there the NAS could handle Google drive, or I could just let my phone handle the GDrive sync, so SyncThing doesn't wake up the spinning disk a bunch on the NAS.
I also want to transcode less important event footage to streaming-ish quality for archival, but I'm assuming just doing that manually on my laptop with HandBrake is probably still the best way?
What do you guys do? What codecs and bitrates do you use for archival transcoding? Any helpful apps you use for manually looking over and deciding what to keep?
How do you handle important vs unimportant files, and stuff that's big enough you don't want to just keep it on your laptop's disk?
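For the HandBrake archival transcoding mentioned above, a hedged batch sketch using HandBrakeCLI. The RF value, encoder preset, and audio bitrate are assumptions, not recommendations from the thread; test on one clip before committing a whole archive. `DRY_RUN=1` (the default) prints the commands instead of running them.

```shell
#!/bin/sh
# Sketch: batch-transcode less-important event footage to streaming-ish
# quality. RF 26 / x265 / 96 kbps AAC are assumed starting points.
set -eu
SRC_DIR="${SRC_DIR:-./footage}"
OUT_DIR="${OUT_DIR:-./archived}"
DRY_RUN="${DRY_RUN:-1}"

run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

mkdir -p "$OUT_DIR"
for f in "$SRC_DIR"/*.mp4; do
  [ -e "$f" ] || continue                 # skip if the glob matched nothing
  base=$(basename "$f" .mp4)
  # x265 at RF 26 is roughly "decent streaming" quality at a fraction of
  # camera-original size; 96 kbps AAC is plenty for event audio.
  run HandBrakeCLI -i "$f" -o "$OUT_DIR/$base.mkv" \
      -e x265 -q 26 --encoder-preset slow -E av_aac -B 96
done
```

Constant-quality (`-q`) rather than a fixed bitrate lets quiet, static footage shrink a lot while busy scenes keep enough bits.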
Ask HN: How do you manage your “family data warehouse”? (127 points, 10 months ago, 135 comments) https://news.ycombinator.com/item?id=37520374
Ask HN: What tools/strategy do you use for backups? (48 points, 2 years ago, 22 comments) https://news.ycombinator.com/item?id=34197837
Ask HN: How do you manage your important personal documents and other data? (91 points, 2 years ago, 60 comments) https://news.ycombinator.com/item?id=33615785
My Nextcloud server is then backed up to another drive on the same machine plus two off-site locations - an old server I still run at my parents' house, and an external HDD my friend let me plug into his home server. The on- and off-site backups are done using borg [0], which does deduplication and encryption (with the keys backed up in 1Password).
I've been meaning to set up an automated restore on one of the offsite servers - a script to automatically unpack the latest backup, set up a fresh DB, and make things read-only - firstly to verify that the backups are functional and complete, but also as a backup in case my home server or home internet goes down. I know in theory I've got everything I need in the backups to do a full restore, but I can't recall the last time I actually tried it out...
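The restore drill described above can be sketched roughly like this: unpack the latest borg archive into a scratch directory and make it read-only. The repo URL, paths, and archive handling are assumptions (and the fresh-DB step is omitted); `DRY_RUN=1` (the default) only prints the commands, so nothing touches a real repo until you flip it to 0.

```shell
#!/bin/sh
# Sketch of an automated borg restore drill. All names are assumed.
set -eu
REPO="${REPO:-ssh://offsite-box/~/backups/nextcloud}"
DEST="${DEST:-/srv/restore-test}"
DRY_RUN="${DRY_RUN:-1}"

run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# Newest archive name; borg list --last 1 does the work in a real run.
if [ "$DRY_RUN" = "1" ]; then
  LATEST="placeholder-archive"
else
  LATEST=$(borg list --last 1 --format '{archive}' "$REPO")
fi

run rm -rf "$DEST"
run mkdir -p "$DEST"
# borg extract writes into the current directory, so cd there first.
run cd "$DEST"
run borg extract "$REPO::$LATEST"
run chmod -R a-w "$DEST"                 # read-only drill copy
echo "drill complete: $DEST <- $LATEST"
```

Running this from cron once a month both proves the backups restore and leaves a warm read-only copy around.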
Fujitsu D3417-A
Xeon 1225v5
64GB ECC RAM
WD SN850x NVMe 2TB
PicoPSU 160W with Bojiadafast Adapter
All in all it cost about 200 bucks (Fujitsu Celsius W550), draws 10W idle, and has enough performance for a home server. The OS is Proxmox with a native encrypted ZFS root using zfs-auto-snapshot (ransomware protection), plus a Docker LXC with Portainer, Nginx Proxy Manager, Let's Encrypt SSL, and OpenWRT + duckdns.org to make it available locally.
Docker containers I use:
immich - photo backup
flatnotes - my personal notes
kavita - ebooks
navidrome - music
audiobookshelf - audio books
jellyfin - video
pihole - adblock
stirling pdf - pdf editing
Backup happens to an external USB drive with one small ZFS bash script and a Tasmota-flashed Shelly plug (switch on the plug, wait until the pool is available, then run an incremental ZFS sync):
zfs send --raw -RI "$BACKUP_FROM_SNAPSHOT" "$BACKUP_UNTIL_SNAPSHOT" | pv | zfs recv -Fdu "$DST_POOL"
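The plug-on / wait / incremental-send flow can be sketched end to end like this. Plug address, pool names, and snapshot labels are assumptions; the Tasmota HTTP call is its standard `cm?cmnd=Power On` endpoint. `DRY_RUN=1` (the default) prints the commands instead of running them.

```shell
#!/bin/sh
# Sketch: power up the backup drive, send the snapshot delta, power down.
set -eu
PLUG="${PLUG:-192.168.1.50}"
FROM_SNAP="${FROM_SNAP:-rpool@backup-prev}"   # last snapshot on the USB drive
UNTIL_SNAP="${UNTIL_SNAP:-rpool@backup-new}"  # newest local snapshot
DST_POOL="${DST_POOL:-usbbackup}"
DRY_RUN="${DRY_RUN:-1}"

run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# Power the drive on via the plug's HTTP API, then wait for the pool.
run curl -s "http://$PLUG/cm?cmnd=Power%20On"
if [ "$DRY_RUN" != "1" ]; then
  until zpool list "$DST_POOL" >/dev/null 2>&1 \
        || zpool import "$DST_POOL" 2>/dev/null; do
    sleep 5
  done
fi

# --raw sends encrypted datasets as-is; -RI sends the incremental chain.
run sh -c "zfs send --raw -RI '$FROM_SNAP' '$UNTIL_SNAP' | pv | zfs recv -Fdu '$DST_POOL'"

# Export cleanly before cutting the power again.
run zpool export "$DST_POOL"
run curl -s "http://$PLUG/cm?cmnd=Power%20Off"
```

Keeping the drive powered off except during the send is cheap air-gapping: ransomware can't encrypt a pool that isn't imported.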
To keep it safe I also back up to my parents' house regularly, and the most important files go via restic to an old Dell T20, as well as being burned to Blu-ray. I also considered a Hetzner Storage Box for this. Google is an absolute no-go for me, especially unencrypted.
When thinking about a backup system you can get really caught up in the weeds: automation and scripts, fancy ideas of cold storage going to an external drive and then synchronizing with a cloud service, then setting up another thing that automatically synchronizes your disk with a close friend who also has a big NAS for second off-site storage... None of it matters if you have never actually tested a scenario where you need to get your data back after a device fails. Before things go bad, simulate a drive failure on your main computer: what would you do if you had to restore everything?
Once you've worked backwards from the best way for you to actually get back whole and working then I think you will be able to answer your own question on ways to manage backup.
Implement a 3-2-1 model: 3 copies of your data, on 2 different media, with one copy held offline.
I use Google Drive, but many people I know like Apple's price point, and others like the bundling with Microsoft.
Tarsnap is worth investigating. The author posts here, and is a long-standing FreeBSD developer outside of this archive service.
Nothing lasts forever. Be prepared to continue to invest on the capex side because your NAS will eventually die.
Mac laptops backup to the NAS using TimeMachine. Windows laptops backup to the NAS using Backvp2 (using its own scheduler). iOS phones use PhotoSync to sync photos to the NAS. Various real and virtual Linux boxes sync to the NAS using rsync in cron. Music is kept on the NAS, and served via Roon+ARC (atm, anyway).
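The "rsync in cron" piece for the Linux boxes might look like the sketch below. NAS hostname, target path, and excludes are assumptions; `DRY_RUN=1` (the default) only prints the command it would run. An example crontab entry: `15 3 * * * /usr/local/bin/sync-to-nas.sh`.

```shell
#!/bin/sh
# Sketch of a nightly rsync push to the NAS. All names are assumed.
set -eu
SRC="${SRC:-/home/}"
DEST="${DEST:-nas:/volume1/backup/linuxbox/}"
DRY_RUN="${DRY_RUN:-1}"

# -a preserves permissions/times, -H keeps hardlinks, --delete mirrors
# removals so the NAS copy tracks the source exactly.
CMD="rsync -aH --delete --exclude=.cache/ $SRC $DEST"

if [ "$DRY_RUN" = "1" ]; then
  echo "+ $CMD"
else
  $CMD
fi
```

Note that `--delete` makes this a mirror, not a backup with history; the NAS-side snapshots are what provide the point-in-time copies.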
The NAS has four drive slots. I use two in an LVM RAID mirror for the primary storage volume, formatted with btrfs. A third slot holds the "offsite" backup drive, which takes a btrfs snapshot every morning at 4am. Every month I send the offsite drive to my friend (who sends my other offsite drive back).
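One way to implement the 4am job, assuming the daily copy reaches the "offsite" drive via btrfs send/receive (a plain snapshot can't cross devices). Paths, subvolume names, and the previous-snapshot bookkeeping are assumptions; `DRY_RUN=1` (the default) prints the commands only.

```shell
#!/bin/sh
# Sketch: daily read-only snapshot, then incremental send to the spare drive.
set -eu
SNAPDIR="${SNAPDIR:-/volume1/.snaps}"     # on the mirrored primary volume
DST="${DST:-/volume3/offsite}"            # third-slot backup drive
TODAY=$(date +%Y-%m-%d)
PREV="${PREV:-2024-01-01}"                # last snapshot already on the drive
DRY_RUN="${DRY_RUN:-1}"

run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# -r = read-only snapshot, which btrfs send requires as a source.
run btrfs subvolume snapshot -r /volume1/data "$SNAPDIR/data-$TODAY"

# -p sends only the delta against the snapshot the drive already has.
run sh -c "btrfs send -p '$SNAPDIR/data-$PREV' '$SNAPDIR/data-$TODAY' | btrfs receive '$DST'"
```

Because the drive ends up holding real subvolumes, the monthly drive swap just works: whichever drive comes back starts a fresh incremental chain from its own latest snapshot.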
I also take irregular Carbon Copy Cloner snapshots of the Macs to an external drive, in case the SSD dies. This makes a simple restore of the OS and apps quick, and then I can pull the latest daily snapshot of the home directories over that.
There's also an archive folder on the NAS for stuff that's no longer live on any laptops, but I want to keep. Old projects, old scanned documents, etc.
I'd love to have a proper hierarchical storage system set up, but I haven't looked into how possible that is in the last few years (and last time I looked, the solutions weren't great).
* Use Time Machine to automatically back up your computer to the Synology NAS
* Use an SMB share to manually manage big files or archives when you need it (use rsync if you're lazy)
* Use Cloud Sync (Synology app) to automatically back up your Google Drive to your Synology
* Use Synology Photos to (semi)automatically back up your photo library from your phone to your NAS
* Use Hyper Backup to back up your NAS to the cloud using S3/Backblaze B2/rsync.
One thing that might be missing is auto transcoding. The 223j might not have enough power to transcode video, but if you get something similar to a DS920+, it can do transcoding for you. I've got mine set up with a Jellyfin container, and it plays very well because the CPU can handle the transcoding.
I keep an external ssd with some important backups and that's about it.
Take printouts of photos, as your digital copies will likely go away with you or when you stop paying.