Since I knew I'd be using different IDEs and environments, I decided to learn vim and use vim bindings in all my IDEs to ensure some sort of consistency across my development environments, but that doesn't cover everything. I've also tried manually migrating my home directory between computers, but that comes with several issues, primarily that I'd have to share it between professional and personal machines. So what I'm left with is basically remembering what I use, then tediously spending a day or more re-configuring each newly installed computer with the software and configurations I know I rely on. It feels like I have to throw away my perfectly configured hammer, which I know by heart, and get a new one every time I work at a different workplace, instead of bringing it along and perfecting it further. Surely there's a better way.
How do you keep your development environments/ways of working synced and backed up?
For example, let's say an app has a default keyboard shortcut I don't like. I won't change it. Instead, I'll just get used to it until it becomes muscle memory.
Now the default configurations are my preferred configurations. Of course, there are still a few things I change from the defaults. But since there are so few I can just configure them by hand, because how often am I setting up a new computer? Not very often. And there are so few config changes, it only takes a minute.
The only thing I have to actually sync is my vim config. It's a very small config, but it's still more than nothing. I just store it in a Git repository.
A couple of bits from a guide I wrote recently: https://gist.github.com/aclarknexient/0ffcb98aa262c585c49d4b...
# store stuff here that you don't want in github
[[ -f $HOME/.zshenv-local ]] && source $HOME/.zshenv-local
# depends on 'brew install grep'
[[ $OSTYPE =~ ^darwin.* ]] && alias grep='ggrep --colour=auto --exclude-dir={.bzr,CVS,.git,.hg,.svn,.idea,.tox}'
That OSTYPE check is great for all those little bits that are unique to each operating system. The zshenv-local file is in .gitignore, so you can keep tokens, API keys, etc. in there. OSTYPE could even be used to source OS-specific files like .zshenv-darwin or .zshenv-linux-gnu.
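A minimal sketch of that per-OS sourcing idea (the file names are just examples):

case $OSTYPE in
  darwin*)    [[ -f $HOME/.zshenv-darwin ]] && source $HOME/.zshenv-darwin ;;
  linux-gnu*) [[ -f $HOME/.zshenv-linux-gnu ]] && source $HOME/.zshenv-linux-gnu ;;
esac

If you're ssh'ing between your hosts, you can use something like this: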
if [[ -n $SSH_CONNECTION ]]; then
ZSH_THEME="afowler"
else
ZSH_THEME="powerlevel10k/powerlevel10k"
fi
It's not fancy and doesn't use cool tools, but my setup works for me just fine.
I pull some information from the local keychain and other bits from 1Password (with `op`; I'll be moving most of the rest from the keychain to 1Password). It can run additional scripts _after_ each update (I use this to update my fish plug-in installations if `fish_plugins` changes).
It has integrations with many different password managers, so it is possible to truly keep your secrets out of your dotfile configuration and _still_ vary the configuration based on those secrets.
I keep my config on GitHub (latest branch: https://github.com/alexghr/nix/tree/host/vader). I'm still new to this, so a lot of my config is duplicated across different hosts, but I want to refactor it to eliminate the duplication.
If you haven't yet, I'd recommend giving Nix/NixOS a try. There's a bit of a learning curve but it's very powerful.
I use it on Windows, Debian, macOS, and even Chromebooks, and don't have to worry about any kind of syncing. It just works.
It has a built-in settings sync feature, but as a general rule I strongly avoid custom settings (in anything) to avoid the issue you're facing. In my experience there's a huge productivity gain to be had by just being pragmatic and using (and knowing) the defaults in everything you use frequently, rather than being a useless prima donna whenever your custom key bindings are missing from a machine you have to use that day/week/month/year. We jump around servers and computers frequently, so I find it easiest and most productive to just go with the flow and use the defaults, so I can be working immediately as soon as I SSH into a new host or am given a new laptop for a 3-week contract or whatever.
The files you're managing don't even need to be text, though, of course, it makes versioning, etc. a lot easier.
I also check the host name in some of the scripts to modify behavior depending on the machine.
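For instance, a synced script can branch on hostname like this (the hostnames and variable here are made up):

case "$(hostname -s)" in
  work-laptop)  role=work ;;       # tighten behavior on the work machine
  home-desktop) role=personal ;;
esac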
I did a write-up about my whole setup (emphasis on GNU Stow + Git) here:
But there were a few problems with it:
- installed packages needed to be managed separately (the mechanisms for which varied by platform)
- managing differing configs for different machines (e.g. headless servers, hardware quirks) was done by managing different branches of this repo and rebasing, which was a pain
- it was generally brittle and hard to change
I've started to use https://nixos.org/ and it's been like night and day: https://github.com/RyanGibb/nixos/
My config is version controlled, reproducible, manages packages, and is very composable.
Using the nix package manager and https://github.com/nix-community/home-manager, this should work on other Linux distributions, macOS, and even Windows Subsystem for Linux, although I haven't tried that for myself yet.
The install script and configuration works on macOS and Linux, but sometimes requires updating when running on a new machine (like recently when provisioning a new M1 Max MacBook Pro). At one point I had it installing default packages (particularly a lot of stuff from Homebrew), but ultimately moved away from that because I realized it just kept too much old unused cruft hanging around.
I also have my configuration files set up so that they can be used on a work or personal machine. For example, my .gitconfig includes another config when in the ~/work directory:
[includeIf "gitdir:~/work/"]
# work-specific settings (override email, etc.)
path = ~/work/.gitconfig.inc
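The included file then only needs to hold the overrides; a minimal example (the email is made up):

# ~/work/.gitconfig.inc
[user]
email = me@work.example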
https://github.com/sethdeckard/dotfiles
Configuration changes are committed and pushed, and whenever changing machines a `git pull && nixos-rebuild switch` would bring the current one up to date.
Things that are automated: installed packages, systemd services, mounts, users and user groups, networking (including VPNs, /etc/hosts), users' environments, users' home folder content (dot files, random useful bash scripts put in PATH, etc), emacs packages.
A few manual steps still remain (adding keys, cloning repos, some desktop environment tweaks), left undone mostly out of laziness to figure them out.
Of course, one has to learn NixOS to use this, and that, quite frankly, has been quite a pain :). Though once you get the hang of it, it relieves a huge pain point: you are no longer mentally attached to any single physical computer.
Would recommend.
Ansible - I can bootstrap a new linux install with all of my configs in ~10min, this has saved me countless hours of time. My linux installs are heavily modified (tiling window manager, lots of custom hotkeys, custom ricing)
Git - I sync all of my dotfiles to Git and can keep them in sync between PCs.
Syncthing - My dev folder and a few other important folders are kept in sync between all of my PCs using syncthing. I can save a file on my desktop and turn to my laptop and the file will almost instantly be there.
# set up stuff (the parentheses run it in a subshell, so variables don't leak)
(
VAR=            # placeholder for whatever value the setup needs
curl            # placeholder: fetch something
"$VAR"          # placeholder: use the value
brew install    # or apt-get install
# ...
)
I write them so I can paste them into the shell at any time and keep my environment updated declaratively, without having to think. This way I never store any local state, not even command aliases. Otherwise I start getting attached to my setup, which leads to anxiety.

It's good practice to check out a project repo periodically and see what it takes to get it running inside a fresh OS or user account. It's handy to do while onboarding new developers too. Sometimes I'll even put these setup instructions at the top of the project repo's readme itself, so it doesn't have a dependency on anything external.
I am mostly using Neovim now, but in the past VSCode with the Remote extension worked great too. It's totally platform-agnostic (I can even work straight from my iPad). It's also a single instance I have to manage, instead of syncing or making changes in more than one place.
With this setup I can still use Docker containers or VMs to make it easy to spin up different types of dev environments, but it's all on one computer, so it's super easy to manage. Since I don't have to worry about managing multiple platforms, I create simple bash scripts to install everything and create the necessary configs, so if I need to wipe everything and start over it's no big deal.
All that said, if I was starting today I would strongly consider just putting everything into nix with home manager and carry that config around in a git repo that installed my packages and their configuration and encapsulated my entire environment.
I'm able to have the same shell with all my Vim plugins across macOS and Linux with changes committed to git. It's been so good, setting up all my command-line tooling now takes about 15 minutes from scratch.
I have multiple machines on multiple operating systems and dev on all of them. I sync nothing across them; instead, I adjust to as many of the defaults of Sublime Text as possible, and the only consistency is in how I organise the working directory on disk.
This isn't what you asked, but I'm not sure one has to sync them all.
Might be bad of me, but I definitely share it between professional and personal - it's just dotfiles after all.
The biggest difficulty is making sure the installation script actually matches what you want to happen, because it's difficult to test on an already-configured system. So every so often I spin up a DigitalOcean droplet and try to configure my system from scratch using the dotfiles.
I can rebuild my configuration (aside from some fussy embedded toolchains) in half an hour or so. VS Code, a few different linters, swissknife, stack tabs, timestamper, indenticator, Pylance... done.
Even if there's a Windows machine, I'm fine. Haven't done anything on a Mac in a decade, but I assume I'd be fine there too.
Also, if something takes more than 5 minutes to learn or set up I will usually at least look for other options, on the assumption that if it takes time to set up, it will probably take time to maintain and modify, because it was probably made by people who enjoy deep customization.
There's lots of things worth learning and setting up, but also lots that... I'm not so sure about.
Right now all my computers run Fedora, so there's not much difference between them, but for some time I used Ansible on macOS too. It worked, but it was a pain to make my roles compatible with both Linux and macOS.
In my experience Ansible is good when you're dealing with text files or when there's already a module for what you're trying to do (for example, managing PostgreSQL users). If what you're trying to do is complex enough that you need to write a custom module or plugin, it's a pain in the ass: we had to do that at work and it was terrible.
Ansible works on Windows but I have no experience with it.
Then for more systemwide configuration, I have an Ansible playbook I run every now and then (configures apps, dock item order, etc): https://github.com/geerlingguy/mac-dev-playbook
https://news.opensuse.org/2020/03/27/Manage-dotfiles-with-Gi...
> then tediously spend a day or more re-configuring newly installed computers with the software and configurations I know I'm using
Config is just a dotfiles pull, and software is either a bash script, dpkg --set-selections, or Ansible (a bit overkill).
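For the dpkg route, the usual Debian/Ubuntu pattern looks something like this (file name is arbitrary):

# on the old machine: capture the package selections
dpkg --get-selections > selections.txt
# on the new one: replay them
sudo dpkg --set-selections < selections.txt
sudo apt-get dselect-upgrade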
Another option is dropbox and lots of symlinks.
If I had to start over now I would consider something like github codespaces, but requiring constant internet connectivity would be enough of a downside I would land on syncthing + vscode again.
I do keep my ~/scripts/ directory in a git repo (this directory is also in my $PATH) because they're all very hack-and-slash "I need to do a thing repeatedly, quickly" scripts. Sometimes I might want to go back a step or two when I really break everything. This repo auto-pulls and auto-commit-pushes every now and then, since having it up to date on all machines matters more than having useful commit messages.
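A sketch of what that auto pull/commit/push can look like (e.g. run from cron; the details here are illustrative):

#!/bin/sh
# auto-sync ~/scripts; commit messages are deliberately throwaway
cd "$HOME/scripts" || exit 1
git pull --rebase --quiet
git add -A
# commit only if something is actually staged
git diff --cached --quiet || git commit -m "auto-sync from $(hostname -s)"
git push --quiet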
As for live syncing between devices, I have Unison running on my laptop frequently to sync (when it's on) a few directories I might need offline. Mostly everything I need as a dev is in git repos though, so I use manual git push/pull to sync everything else.
I did previously use mackup (https://github.com/lra/mackup) to sync configs, but if I'm honest, I forgot I had it set up the last time I reinstalled, so it got dropped from the process unintentionally.
Note: I don't sync anything to my work computer. I will never trust adding a conduit for my personal activity or files to my work computer.
I use SyncThing which syncs critical parts of my home directory (.config, project files, misc dot files, git configs for multiple profiles). My SyncThing setup is local network only. I work mostly on my desktop and if I'm about to leave I spin my laptop up real quick, it syncs everything in a few seconds, and I'm out the door.
In terms of projects, this also syncs my project settings from VSCode and IntelliJ. I was a little worried about my apps that support hot code reloading if I left the daemon running by mistake, but honestly I've discovered no local state issues.
My shell configuration is also synced, which is really nice. Once homed (in systemd) becomes more of a thing I'll probably see how I can incorporate that workflow.
I don't do a ton of configuration these days, I mostly just set up VSCode, fzf and my .vimrc and feel pretty happy. But whenever I do want to make a change, I know it's worth the effort, since it can be easily synced to any of my machines.
I do have a setup where my bashrc and zshrc just source a common aliases / functions file, which makes most of my config work across both shells. If I'm doing something tricky then I'll usually take the time to implement it for both shells.
I also have a file called `.system-specific-profile` which is sourced by my common profile, but is not synced - this is where I dump one-off stuff that doesn't need to be synced.
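A minimal sketch of that layering (`.shell-common` is a made-up name; `.system-specific-profile` is the real one described above):

# in both ~/.bashrc and ~/.zshrc
[ -f "$HOME/.shell-common" ] && . "$HOME/.shell-common"

# at the end of .shell-common: machine-local bits that never get synced
[ -f "$HOME/.system-specific-profile" ] && . "$HOME/.system-specific-profile"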
All in all it's fairly minimal but flexible enough for basically anything I've ever needed to do.
- `git clone` said repository to my machines
- `ln` from repository folder to `~/` (can be done in an install script or piecemeal; sketched below)
- `git pull` to update
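The `ln` step mentioned above can be as simple as (repo path and file names are illustrative):

ln -s ~/dotfiles/vimrc ~/.vimrc
ln -s ~/dotfiles/gitconfig ~/.gitconfig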
I also purposely try and use as many default configurations as possible. I use a lot of remote machines that do not have my configs on them, so I do not want to become too reliant on custom setups.
Linux is essentially automatic. Install OS using custom arch-iso build. Download Ansible config repo. Run playbook.
Windows is a bit rougher: Ansible runs in WSL, connecting to the host via SSH.
For work I just fork my personal config, add changes needed for work environment in a private repo. I can pull upstream changes and rebase easily.
Main downside is it's a bit verbose, and I still need manual steps to converge certain types of changes (like deleting/moving configs) because I don't want to write even more code to do that.
Having all the additional steps in a role and not just config files is extremely useful when revisiting a configuration years later.
For development, I usually just use Git to synchronize changes. Syncthing is useful sometimes.
My .dotfiles folder has a script in it that symlinks the dotfiles and folders out to the right places, then installs any tools with asdf and the .tool-versions file I save. It usually gets any Linux setup up and running in a few seconds.
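On a fresh machine, replaying a saved .tool-versions with asdf can look something like this (just a sketch):

# from $HOME (where .tool-versions is symlinked), add each pinned plugin,
# then install every version the file lists
cd "$HOME"
cut -d' ' -f1 .tool-versions | xargs -n1 asdf plugin add  # re-adding an existing plugin just errors harmlessly
asdf install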
I then have a workspace folder that stores all my code and projects. I use Syncthing to replicate it between my computers and a third server machine as a backup, which has weekly backups running. The three-way sync makes it hard to lose anything, even if one machine goes down when I need my most recent files.
Since I use WSL on Windows, the same stuff works mostly the same there. The script I have is tailored to my preferred distros: currently a Fedora one, but I also have an Ubuntu one and an Arch one for when I use those or need to get my stuff set up on them. I try to keep all of them updated. asdf really helps with the tools.
That works VERY well, but might not fit your needs depending on what you need to use for work...
Somewhat related question:
Can GNU Stow do this, and if not, can anyone recommend a package/app that can? I want a config file with a list of source files and where I want them symlinked, and a tool that maintains those links. Example:
# SOURCE # DESTINATION
zsh/zshenv ~/.zshenv
zsh/zshrc ~/.zshrc
gitconfig ~/.gitconfig
From my reading of GNU Stow, it operates on full directories and relies on the files already being named correctly.
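For what it's worth, a mapping file like that can be driven by a few lines of shell; this is just a sketch (bash, because of the tilde expansion; `linkmap` is a made-up file name):

# read "source destination" pairs, skipping blanks and comment lines
while read -r src dest; do
  case $src in ''|'#'*) continue ;; esac
  dest=${dest/#\~/$HOME}          # expand a leading ~ by hand
  mkdir -p "$(dirname "$dest")"
  ln -sfn "$PWD/$src" "$dest"
done < linkmap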
The advantage of all this is that they are separate, so you don't need to worry about some embedded IDE or Python crap barfing all over your system. I check all the Dockerfiles and config files they install into git. I've been watching Nix, but it seems like a bit of a Marmite project, so I might wait to see if something better turns up.
This allows me to abstract logic easily, have it all modular and have everything be nix.
There are lots of dotfiles repos on GitHub. And you learn that quickly.
It's more or less a wrapper around git, but it comes with a bunch of features that let you do the things you describe.
Another option I considered was Ansible or similar, but my script was very simple to write and has no dependencies other than bash and git which I do kind of like from a simplicity perspective.
Some notes:
1. rcm lets you choose between host-specific and host-agnostic dotfiles. For example, I can declare that I want a different `.ssh/config` file for each host, and rcm will figure out which `.ssh/config` to symlink based on the current machine's hostname.
2. The installation process is very simple. It's just shell scripts, so you don't have to worry about a C/C++ compiler or having any pre-reqs installed. Operating system packages exist for the common platforms, and there's also a convenient way to "build" from source using configure && make && make install. The from source option is particularly convenient if you need to change the installation prefix to a user-writable location on a multi-user machine.
3. I use SSH Agent Forwarding[1] to avoid needing to install private keys (either new keys or copies of existing keys) on all the hosts I manage. This lets me git push and pull to my dotfiles repo on all hosts.
4. Taking it a step further, some of my shell config is host-specific (e.g., certain PATH modifications I only want to apply on certain hosts). Rather than use rcm's host-specific dotfile feature for the whole .bashrc, I factor my shell config into multiple files that I then source. One of these is `$HOME/.util/host.sh`, which is host-specific. Again, rcm creates the symlink from this to the correct host-specific file automatically by hostname.
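The sourcing side of that is straightforward; a sketch (file names other than host.sh are made up):

# in ~/.bashrc: pull in the factored config pieces
. "$HOME/.util/aliases.sh"
. "$HOME/.util/path.sh"
. "$HOME/.util/host.sh"   # rcm points this symlink at the right per-host file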
If you're curious to learn more about any of this, my dotfiles are public.[2]
[0] https://github.com/thoughtbot/rcm
[1] https://docs.github.com/en/developers/overview/using-ssh-age...
I take monthly snapshots of the VM, and carry it around on all my computers. It drifts from time to time, and I just end up just overwriting it with an older version when I fumble with some settings. Once every 3-4 years I do a clean reinstall of the VM.
I tweak frequently used software like PyCharm a bit, but aim to keep everything stock and constantly updated.
I've not had any issues or regrets in the last 5 years and I'll carry this on indefinitely.
I don't! I love setting up each new machine fresh, because I feel like during the development cycle I accumulate a lot of software/CLI tooling debt that I don't need anymore.
Dropbox for Obsidian and Alfred syncing - the three-device limit on the free tier is just barely enough for 2 of my own computers and my work laptop.
I've also got a Brewfile for installing the basic tooling on macOS
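If you haven't used one: a Brewfile is just a declarative package list that `brew bundle` replays, and it can be generated from an existing machine, something like:

# generate a Brewfile from everything currently installed
brew bundle dump
# replay it on a new machine (run in the directory containing the Brewfile)
brew bundle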
I also have a "how to set up a new computer/server" document on Notion that I use so I don't forget any steps.
Between my dotfiles and cloud storage, you could throw my laptop in a lake and I could configure a brand new machine in an hour or two.
In each case I stick with the defaults except for minor tweaks, for which I keep a Markdown bullet list - things like switching on hidden files and file-extension visibility, installing obscure things like Rectangle and Silicon Info on the Mac, or where to get my favourite fonts from (e.g. iA Writer Duospace). None of that is time-consuming enough to script or otherwise automate.
No need to sync anything, but to be fair I habitually reinstall the OS at least once a quarter so I'm pretty practiced at it.
Self-provided dotfiles for shell and the like.
But since I started using VS Code Remote, the "bootstrap" portion doesn't happen all too often.
For development environments, JetBrains IDEs have their own cloud config sync thing and I try to stick to defaults as much as possible (and rely on the language environment's tooling for linting/formatting/etc rather than the one in the IDE).
Now that there are flakes (and the macOS installer is really good now), I can `git clone` my config and bootstrap a new machine with this:
# install Nix (multi-user daemon mode)
sh <(curl -L https://nixos.org/nix/install) --daemon
# build and activate the home-manager configuration for this machine
nix build .#homeConfigurations.{computer}.activationPackage --max-jobs auto --cores $(sysctl -n hw.ncpu) && ./result/activate && rm -rf ./result
Everything is in a git repository I usually call dotfiles.
https://github.com/danisztls/dotfiles
> primarily that I'd have to try to share it between professional and personal computers
My script has a recipes feature so you can store config at arbitrary places and have different recipes/configs for different situations.
Unfortunately, I can't find perfect remote-desktop software: it would need to automatically adapt to client resolutions, support multiple monitors, and use hardware acceleration.
RDP is great, but it doesn't utilize the high bandwidth of modern network environments, so it's not as performant as it could be.
Put the config files in their own directory, share that directory with Syncthing, and then use symlinks to put them in the right place on each system.
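For example (the shared folder and paths here are illustrative):

# ~/Sync/config is the Syncthing-shared folder
ln -s ~/Sync/config/gitconfig ~/.gitconfig
ln -s ~/Sync/config/nvim ~/.config/nvim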
1. For dotfiles, YADM: https://yadm.io/
2. For everything else, like packages to install, I use an Ansible playbook.
Both sync via git.
I'm another one using a private GitHub repo for this. But I don't need to pull, update, sync, or anything: I've been using the same config for years, and in the eventual case that I change or add something, I can do that quickly on a few computers; no need for any sort of sync.
For VSCode settings I just use the built-in service.
Common files shared between many environments just live in a common subdirectory. This could be a submodule, but that's overkill; I just sync it via a git path checkout from time to time.
Dotfiles are in there too, symlinked to wherever they're used. Similar with other configs. If a program is picky about being symlinked, I'll write a quick copy script to pull its config into a subfolder and bring it into git.
As long as things stay reasonably connected and you take a bit of care to avoid split-brain (after powering up a laptop that has been off or away for a while, give it some time to sync), you can even sync git repos just fine.
First, I keep a personal monorepo. It not only has all my dotfiles, but also my personal website, and binary tools I've written. A game changer for me was git submodules, which let me keep third party stuff up to date with a minimum of fuss. The only thing I keep in a separate repo is my password repository.
So the main script I use invokes a lot of other scripts in order, so I can run one thing to update everything. The first thing it does is sync the monorepo and password repos. Then it runs my home-grown stow command, which is like GNU Stow[1] but gives me all the control I want because I wrote it in Ruby. This not only syncs my dotfiles but also copies all my home-grown bins to /usr/local/bin, so they're not only in my $PATH but in root's path as well.
Next is package management. I run Arch Linux and use pacdef[2] for lightweight declarative package management. If I install some random package, it will remove it at this step, unless I put it in one of my package lists that I keep in the monorepo. This tool will take any Arch Linux install with paru or yay and make it into mostly the system I use every day. It is the most brittle part of the process because, well, operating systems change.
Finally, to manage all the other package-management tools out there, I use asdf.[3] My script installs all the plugins if they're not there, updates them, then installs everything. Most asdf plugins support a 'default lib' config, which installs a list of packages whenever you install a version of the tool. It's not the wholly declarative solution that pacdef is, but I may eventually write enough scripting to get it there.
It has to be this way because Linux is freedom, and with great freedom comes great amounts of time spent learning how it works. I used to try to keep a portable config across OSX, Windows WSL, and Linux, but that was really hard to maintain, and my life got way simpler when I decided no more OSX or Windows. It's kinda possible to do but it's a ton of work.
But the basic building blocks are, 1) dotfiles, 2) system packages, 3) secondary packages. Get those three scripted, and you're off to the races.
[1] https://www.gnu.org/software/stow/