HACKER Q&A
📣 gtirloni

Is keeping dependencies up to date wasteful?


If you're not affected by any of the bugs being fixed in newer versions, what do you think about updating dependencies periodically?


  👤 pacificenigma Accepted Answer ✓
It depends:

- Is it a reasonable time expenditure relative to other priorities (an indie dev building an MVP is quite different from a 20-person team at a bank)?

- Are the dependencies "large" (framework level) or "small" (tiny, focused library)?

- Can you rely on semantic versioning to give a clue as to upgrade cost?

- How long has it been since you last upgraded (the further behind you are, the more breakage)?

- Does the language/platform you use make it easy (i.e., do you need to synchronize native libraries as well)?

- Can you depend on your build system to reliably test the upgrade and report stability?

- Do you have higher-than-usual dependency requirements (security, compliance, risk reviews, license review, approvals)?

Having said all that, I upgrade our dependencies every month. It only takes a couple of hours and very rarely causes issues (maybe 1 in 6 monthly upgrades requires an extra hour to identify a regression in a newer version and add a comment with an issue-tracker link, pausing upgrades of that dependency until it's fixed).
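As a concrete illustration of that "pause until fixed" note (the tool, dependency name, and issue URL here are my own placeholders; the comment above doesn't say which updater is in use), a Dependabot-style config can carry the comment and the issue link right next to the ignore rule:

```yaml
# .github/dependabot.yml -- hypothetical excerpt; "somelib" and the issue URL are placeholders
version: 2
updates:
  - package-ecosystem: "pip"        # assuming a Python project; adjust per ecosystem
    directory: "/"
    schedule:
      interval: "monthly"           # matches the monthly cadence described above
    ignore:
      # Regression in 2.4.x (see https://tracker.example/ISSUE-123); drop this once fixed upstream.
      - dependency-name: "somelib"
        versions: [">=2.4.0"]
```

The same idea works as a plain version pin plus a comment in the manifest; the point is just that the pause and the reason for it live side by side in version control.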


👤 gshdg
You don’t have to update to every new teensy release immediately. But do update dependencies regularly (at least once or twice a year). Once you’re in a hole deeper than a year it takes a LONG time to climb out.

👤 seattle_spring
Absolutely not. If you neglect keeping your packages up to date, it tends to get exponentially harder to upgrade them when you do need that critical bugfix.

👤 peterbozso
You get more benefits from updating your dependencies than "just" bugfixes. There can be performance improvements or security updates in new releases. If you have a proper CI/CD pipeline and test environment(s) in place, it shouldn't be too much work. (Of course, it highly depends on your domain.) Then the benefits clearly outweigh the extra effort.
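A minimal sketch of what "proper CI" means here (assuming GitHub Actions and an npm project, neither of which the comment specifies): any pull request, including an automated dependency bump, gets the same test run as a hand-written change.

```yaml
# .github/workflows/ci.yml -- minimal sketch; tool and project type are assumptions
name: CI
on:
  pull_request:    # dependency-bump PRs are tested exactly like any other change
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "20"
      - run: npm ci
      - run: npm test
```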

👤 eb0la
My CS teacher told me that in the Good-old-times(tm) they archived not just the external libraries, but also a copy of the compiler along with the source code for any given release.

I still wonder how they actually did it, because storage was expensive as hell.

Probably everything was backed up on tape.


👤 jfrisby
What you're doing by keeping up to date continuously is reducing the cost of a major version bump down the road -- should that become required.

For example: If you had a Rails 3 app, and you updated to Rails 4 when Rails 4 came out, then to Rails 5 when that came out, you'd find the upgrade to Rails 6 is relatively straightforward. If, however, you stayed on 3 until 6 came out, then discovered you _really really_ need to be on 6 for whatever reason... Well, that's a nightmare of a project for any non-trivial app.

Each individual upgrade takes time and effort, of course, but the developers and/or community of whatever tool/library/framework you use are absolutely gonna chart an upgrade path from (X) to (X+1). Going from (X) to (X+N), however, can be exponentially more complex -- and you'll rarely find clear guidance / documentation / etc, if only because of the large number of permutations.

Now, why might you find yourself in sudden need of a major upgrade? A common situation is that a major security vulnerability is discovered that affects you -- but the version you're on is past the end-of-life for security updates. That's not the only possible situation, but it's perhaps the most obvious and broadly applicable scenario.

The worst possible time to have to face a potentially very challenging upgrade process is when there's a zero-day in the wild, and you're vulnerable.

So really, what you're doing by keeping deps up-to-date is mitigating the risk of winding up in that situation. This comes at a non-zero cost, of course. Hence your question.

The difficulty is that it's really hard to put a meaningful probability on whether/when you might wind up in a situation like that. I've started a bunch of companies over the >20 years I've been coding professionally, so I've gotten to see this play out in a variety of ways. I've had companies where it was never an issue. I also had one where my last-ditch effort to save the company was foiled by an upgrade path that effectively amounted to "rebuild everything from scratch". That one wasn't security-related and it was a (3D) game, so the circumstances there are probably pretty atypical for most folks on HN -- but it was a serious kick in the teeth for me.

Given that, I tend to assume the probability that I'll be in a position where I _must_ do a major, multi-version upgrade goes to 1 on a timeline of about 4-5 years. From there, it's just a matter of how effectively I can amortize those incremental costs. Patch-level releases? Automate the update (e.g. DependaBot). Minor upgrades? Lean in on anything where changes are mandatory (pretty uncommon), but don't block things up on fixing every deprecation -- those can be addressed incrementally. Major updates? Deal with those on a case-by-case basis, factoring in how major/breaking the changes are and what circumstances the business is facing. Often this involves doing a spike of a _super_ rough upgrade, just to see what breaks.
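To make the "automate patch-level releases" part concrete (a sketch under my own assumptions -- the comment names DependaBot but no specific setup, so the ecosystem and cadence below are placeholders), the ignore rules can restrict automated PRs to patch releases and leave minor/major bumps for a deliberate pass:

```yaml
# .github/dependabot.yml -- hypothetical sketch: auto-PR patch releases only
version: 2
updates:
  - package-ecosystem: "bundler"    # e.g. for the Rails example above; adjust per ecosystem
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      - dependency-name: "*"
        update-types:
          - "version-update:semver-major"
          - "version-update:semver-minor"
```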

I can't actually think of any circumstances where I'd look at the costs of incrementally keeping up to date and conclude that it was a bad idea to do so, although I confess that may just represent a lack of imagination and/or a bit of paranoia on my part.