HACKER Q&A
📣 eternityforest

How do you manage files and backups as an individual?


Just ordered my first NAS (Synology 223j) and a 12TB disk (refurb on Amazon), and I'm not entirely sure what to do with it when it gets here.

I'd like to be more organized about this stuff, but a lot of the setups I see online are for multi-thousand-dollar hardware: trying to replace the cloud entirely, torrenting and Plex, self-hosted stuff that needs regular maintenance, etc.

I think I want "Cold storage" files to live on the NAS, and get backed up from there to a 4TB external drive, and everything else to get backed up to the NAS.

Beyond that, I'm not quite sure.

I'd like automatic sync to Google Drive, since that seems to actually be a thing for Synology even though it seems to be a hassle on Linux.

I suppose I could sync to the NAS with Syncthing, and from there the NAS could handle Google Drive; or I could just let my phone handle the Google Drive sync, so Syncthing doesn't wake the NAS's spinning disk a bunch.
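For the Linux side of this, one common workaround is rclone (not mentioned above, so treat this as a suggestion): it can do one-way sync to a Google Drive remote from a cron job. The share path and remote name below are made-up examples; the remote would first be created interactively with `rclone config`.

```shell
# One-way sync of a local folder up to Google Drive with rclone.
# "gdrive" is a hypothetical remote name set up via `rclone config`;
# the source path is a made-up NAS share.
SRC="/volume1/cold-storage"
DEST="gdrive:nas-backup"

run_sync() {
    rclone sync "$SRC" "$DEST" --transfers 4 --checkers 8 --log-level INFO
}

# Guard so reading/sourcing this sketch doesn't start a transfer;
# invoke as `./gdrive-sync.sh --run` (e.g. from cron) to actually sync.
if [ "${1:-}" = "--run" ]; then run_sync; fi
```

Note that `rclone sync` makes the destination match the source, deleting extra files on the remote, so it pairs best with versioned backups elsewhere.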

I also want to transcode less important event footage down to streaming-ish quality for archival. I'm assuming just doing that manually on my laptop with HandBrake is probably still the best way?

What do you guys do? What codecs and bitrates do you use for archival transcoding? Any helpful apps you use for manually looking over and deciding what to keep?

How do you handle important vs unimportant files, and stuff that's big enough you don't want to just keep it on your laptop's disk?


  👤 gnabgib Accepted Answer ✓
Similar Asks (perhaps some inspiration?):

Ask HN: How do you manage your “family data warehouse”? (127 points, 10 months ago, 135 comments) https://news.ycombinator.com/item?id=37520374

Ask HN: What tools/strategy do you use for backups? (48 points, 2 years ago, 22 comments) https://news.ycombinator.com/item?id=34197837

Ask HN: How do you manage your important personal documents and other data? (91 points, 2 years ago, 60 comments) https://news.ycombinator.com/item?id=33615785


👤 maples37
I started using Nextcloud for file/contact/calendar syncing a few years ago, and have gradually moved most of my digital life into it. Documents and photos, but also scripts to automatically set some things up for me when I do a fresh install of Linux (I've been playing around with a few different distros lately). The only thing that doesn't live in Nextcloud are some old DVD rips, and that's mostly due to "haven't gotten around to it yet". Besides those, if it's not in Nextcloud, it's not something I care too much about losing if a disk were to fail or I decide to wipe the disk to try a different distro.

My Nextcloud server is then backed up to another drive on the same machine plus two off-site locations: an old server I still run at my parents' house, and an external HDD my friend let me plug into his home server. The on- and off-site backups are done using borg [0], which does deduplication and encryption (with the keys backed up in 1Password).

I've been meaning to set up an automated restore on one of the offsite servers - a script to automatically unpack the latest backup, set up a fresh DB, and make things read-only - firstly to verify that the backups are functional and complete, but also as a backup in case my home server or home internet goes down. I know in theory I've got everything I need in the backups to do a full restore, but I can't recall the last time I actually tried it out...

[0] https://www.borgbackup.org/
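A borg run of the shape described above might look like the following sketch. The repo path, source directory, and retention policy are all illustrative placeholders; the repo is assumed to have been created once with `borg init --encryption=repokey`.

```shell
# Nightly borg backup sketch. All paths are placeholder examples;
# the repo must already exist (created via `borg init --encryption=repokey`).
export BORG_REPO="/mnt/backup/borg-repo"
ARCHIVE="nextcloud-$(date +%Y-%m-%d)"

backup() {
    # deduplicated, compressed, encrypted archive of the data directory
    borg create --stats --compression zstd "::${ARCHIVE}" /var/nextcloud/data
    # thin out old archives so the repo doesn't grow without bound
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 12
}

# Guard: run only when invoked explicitly with --run.
if [ "${1:-}" = "--run" ]; then backup; fi
```

With repokey encryption the key lives inside the repo and is unlocked by a passphrase, which is why backing up the passphrase (e.g. in 1Password, as above) matters.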


👤 sandreas
I use the following config:

  Fujitsu D3417-A
  Xeon 1225v5
  64GB ECC RAM
  WD SN850x NVMe 2TB
  PicoPSU 160W with Bojiadafast Adapter
All in all it cost about 200 bucks (it's a Fujitsu Celsius W550), draws 10W idle, and has enough performance for a home server.

OS is Proxmox with a natively encrypted ZFS root using zfs-auto-snapshot (ransomware protection), plus a Docker LXC with Portainer, Nginx Proxy Manager, Let's Encrypt SSL, and OpenWRT + duckdns.org to make it available remotely.

Docker containers I use:

  immich - photo backup
  flatnotes - my personal notes
  kavita - ebooks
  navidrome - music
  audiobookshelf - audio books
  jellyfin - video
  pihole - adblock
  stirling pdf - pdf editing
  
Backup happens to an external USB drive with one small ZFS bash script and a Tasmota Shelly plug (switch on the plug, wait until the drive is available, run an incremental ZFS sync):

  zfs send --raw -RI "$BACKUP_FROM_SNAPSHOT" "$BACKUP_UNTIL_SNAPSHOT" | pv | zfs recv -Fdu "$DST_POOL"
To keep it safe I also back up to my parents' house regularly, sending the most important files via restic to an old Dell T20, as well as burning them to Blu-ray. I also considered a Hetzner Storage Box for this.
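A restic push to a machine like that old T20 could look like this sketch; the SFTP repo address, password file, and retention flags are placeholder assumptions, not the commenter's actual setup.

```shell
# restic off-site backup sketch. Repo address and password file
# are hypothetical; the repo must exist (created via `restic init`).
export RESTIC_REPOSITORY="sftp:backup@t20.local:/srv/restic-repo"
export RESTIC_PASSWORD_FILE="$HOME/.restic-pass"

backup() {
    # encrypted, deduplicated snapshot of the important files
    restic backup /srv/important --tag nightly
    # drop old snapshots and reclaim space
    restic forget --keep-daily 7 --keep-monthly 6 --prune
}

# Guard: run only when invoked explicitly with --run.
if [ "${1:-}" = "--run" ]; then backup; fi
```

restic encrypts everything client-side, which fits the "no unencrypted cloud" stance below: the same script would work against a Hetzner Storage Box or an S3 bucket by changing only `RESTIC_REPOSITORY`.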

Google is an absolute no-go for me, especially unencrypted.


👤 navjack27
I don't know, I just manually manage everything on my computer. I have an older computer that I use as a NAS; it has all solid-state storage (a mixture of SSDs and NVMe drives) and runs Rockstor. As I find I need something, I just throw it on the NAS. When I need space, I throw in my Intel Arc A750 to transcode movies or TV shows I've downloaded to AV1 using StaxRip. No set bitrate for that, because a fixed bitrate is inefficient: read about the audio and video codecs that exist, and once you understand them you'll know how to use them correctly — use a quality setting instead and get more efficiency.
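The quality-setting-instead-of-bitrate idea can be sketched with ffmpeg (a different tool than StaxRip, chosen here because it's scriptable; the input filename and CRF value are made-up examples). CRF lets the encoder spend bits where the picture needs them rather than hitting a fixed rate.

```shell
# Quality-targeted archival transcode with a CRF (constant rate factor)
# instead of a fixed bitrate. Filename and CRF value are examples;
# swap -c:v libx265 for libsvtav1 if you want AV1.
IN="event-footage.mp4"
OUT="${IN%.*}-archive.mkv"

transcode() {
    ffmpeg -i "$IN" \
        -c:v libx265 -crf 26 -preset slow \
        -c:a libopus -b:a 96k \
        "$OUT"
}

# Guard: run only when invoked explicitly with --run.
if [ "${1:-}" = "--run" ]; then transcode; fi
```

Higher CRF means smaller files and lower quality (for x265, somewhere around 24–28 is a common archival compromise, but eyeball your own footage rather than trusting a number).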

When thinking about a backup system you can get really caught up in the weeds with automation and scripts and fancy ideas: cold storage going to an external drive, then synchronizing with a cloud service, then setting up another thing that automatically syncs your disk to a close friend's big NAS for second off-site storage... None of this matters if you have never actually tested a scenario where you need to get your data back after a device fails. Before things get too bad, simulate a drive failure on your main computer. What would you do if you had to restore things?

Once you've worked backwards from the best way for you to actually get back whole and working, I think you'll be able to answer your own question about how to manage backups.
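A restore drill of the kind described above can be tiny. This sketch is tool-agnostic: the `cp` line stands in for your real restore command (e.g. `restic restore latest --target "$RESTORED"`), and the point is the comparison afterward.

```shell
# Minimal restore drill: restore into a scratch directory, then
# compare against the live copy. The cp line is a stand-in for
# your actual backup tool's restore command.
LIVE="$(mktemp -d)"
RESTORED="$(mktemp -d)"

echo "hello" > "$LIVE/file.txt"          # pretend this is live data

cp -R "$LIVE/." "$RESTORED/"             # <-- replace with real restore

if diff -r "$LIVE" "$RESTORED" >/dev/null; then
    RESULT="restore verified"
else
    RESULT="restore MISMATCH"
fi
echo "$RESULT"
```

Running something like this on a schedule (and alerting on MISMATCH) turns "I think my backups work" into a checked fact.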


👤 ggm
Run an OS like TrueNAS (Core is BSD-based, Scale is Linux-based) and keep your data on ZFS so you can take snapshots, and write those snapshots to offline storage.
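The snapshot-to-offline-storage step might look like this sketch; the pool/dataset name and mount point are hypothetical, not from the comment.

```shell
# ZFS snapshot + offline copy sketch. "tank/data" and the
# /mnt/offline mount point are made-up example names.
DATASET="tank/data"
STAMP="$(date +%Y%m%d)"
SNAP="${DATASET}@${STAMP}"

offline_copy() {
    zfs snapshot "$SNAP"
    # Stream the snapshot to a file on a removable drive as the
    # offline copy of the 3-2-1 scheme.
    zfs send "$SNAP" | gzip > "/mnt/offline/${STAMP}.zfs.gz"
}

# Guard: run only when invoked explicitly with --run.
if [ "${1:-}" = "--run" ]; then offline_copy; fi
```

Incremental sends (`zfs send -i old@snap new@snap`) keep subsequent offline copies small once a full stream exists.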

Implement a 3-2-1 model: 3 copies of your data, 2 locations of media, one copy held offline.

I use Google Drive, but many people I know like Apple's price point, and others like the bundling with Microsoft.

Tarsnap is worth investigating. The author writes here, and is a long-standing FreeBSD developer outside of this archive service.

Nothing lasts forever. Be prepared to continue to invest on the capex side because your NAS will eventually die.


👤 __d
I use a NAS, with some custom scripts.

Mac laptops back up to the NAS using Time Machine. Windows laptops back up to the NAS using Backvp2 (with its own scheduler). iOS phones use PhotoSync to sync photos to the NAS. Various real and virtual Linux boxes sync to the NAS using rsync in cron. Music is kept on the NAS and served via Roon + ARC (atm, anyway).
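The rsync-in-cron piece can be sketched like this; the NAS hostname, share path, and schedule are made-up examples, not the commenter's actual config.

```shell
# rsync-to-NAS sketch of the kind you'd call from cron.
# Hostname and destination path are hypothetical examples.
SRC="$HOME/"
DEST="nas.local:/volume1/backups/$(uname -n)/"

do_sync() {
    # -a preserves permissions/times; --delete mirrors removals;
    # exclude caches that aren't worth backing up
    rsync -a --delete --exclude '.cache/' "$SRC" "$DEST"
}

# Guard: run only when invoked explicitly with --run.
if [ "${1:-}" = "--run" ]; then do_sync; fi

# Example crontab entry (daily at 03:00):
#   0 3 * * * /usr/local/bin/nas-sync.sh --run
```

Note `--delete` makes this a mirror, not a versioned backup, so it relies on the NAS side (snapshots, as described above) for history.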

The NAS has four drive slots. I use two in an LVM RAID mirror for the primary storage volume, formatted with btrfs. A third slot holds the "offsite" backup drive, which takes a btrfs snapshot every morning at 4am. Every month I send the offsite drive to my friend (who sends my other offsite drive back).

I also take irregular Carbon Copy Cloner snapshots of the Macs to an external drive, in case the SSD dies. This makes a simple restore of the OS and apps quick, and then I can pull the latest daily snapshot of the home directories over that.

There's also an archive folder on the NAS for stuff that's no longer live on any laptops, but I want to keep. Old projects, old scanned documents, etc.

I'd love to have a proper hierarchical storage system set up, but I haven't looked into how possible that is in the last few years (and last time I looked, the solutions weren't great).


👤 mindwork
You've got everything that you need in Synology.

* Use Time Machine to automatically backup your computer to Synology NAS

* Use an SMB share to manually manage big files or archives when you need to (use rsync if you're lazy)

* Use Cloud Sync (a Synology app) to automatically back up your Google Drive to your Synology

* Use Synology Photos to (semi)automatically back up your photo library from your phone to your NAS

* Use Hyper Backup to backup your NAS to the cloud using S3/Backblaze B2/rsync.

One thing that might be missing is auto-transcoding. The 223j might not have enough power to transcode video, but something similar to the DS920+ can do it for you. I've got mine set up with a Jellyfin container, and it plays very well because the CPU can handle the transcoding.


👤 blackbaze
Old crappy PC has a shared folder on it. That folder is backed up using Backblaze. I dump stuff I want backed up from other PCs there.

👤 evnix
A OneDrive family plan turned out cheaper than setting everything up myself, given multiple phones, laptops, and OSes, and elderly parents not living close to me.

I keep an external ssd with some important backups and that's about it.

Take printouts of photos, as your digital copies will likely go away with you, or when you stop paying.


👤 abrookewood
Old HP Microserver running Ubuntu & more importantly ZFS. Drives are two sets of mirrors. All contents get hourly/daily snapshots which persist for 35 days and then offsite backup using Duplicacy to S3. Roughly 500GB for around AUD$10 per month.

👤 t_believ-er873
I use a backup tool, Xopero, for automating my backups.

👤 lobito14
Nextcloud hosted on my local network and syncthing.