It feels like most companies I've worked at are terrible at maintaining software over any reasonable period of time. This month alone I've worked on PRODUCTION applications using .NET Core 3, Node.js v6, PHP 5, and MySQL 5.7, and that's just off the top of my head. Some companies I've worked at have even been hostile to the idea of updating to a supported version of something, saying it added no value and was just churn for churn's sake. Perhaps some of that is true, but it just kicks the can further down the road.
What I'm wondering is: how common is this? Is this one of the reasons (I'm sure there are many) why I get letters every couple of months saying my data was leaked or breached by some company? And how do you convince management to keep up with a constantly shifting and evolving software world when they see little value in it?
It's no mystery as to why, either. Upgrading tools is expensive by just about every metric, not just financially but also in terms of disruption. And using the latest version of anything carries a real risk: the latest version is generally the one most likely to contain serious problems. As a general rule, it's better to sit at N-1.
Deciding not to upgrade, in the absence of a benefit that can offset the cost, is not entirely irrational.
And when an unsupported dependency does turn up a security hole, you can usually patch it yourself (rough sketch below):
- find the source online
- get the build going
- fix the security problem
- deploy to local artifact repo
- keep on trucking ...
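Here's a minimal sketch of what that loop can look like when scripted. Everything in it is an assumption for illustration: the upstream repo URL, the patch file name, the build command, and the internal artifact repo endpoint are placeholders, not anything from a real project, so swap in whatever your stack actually uses (npm pack, dotnet pack, mvn deploy, etc.).

```python
#!/usr/bin/env python3
"""Sketch of the "patch it yourself" loop for an abandoned dependency.

All names below (UPSTREAM, PATCH, ARTIFACT_REPO, the `make dist` build
step) are hypothetical placeholders.
"""
import subprocess
from pathlib import Path

UPSTREAM = "https://github.com/example/abandoned-lib.git"   # hypothetical upstream
PATCH = Path("fix-security-issue.patch")                    # your local security fix
ARTIFACT_REPO = "https://artifacts.internal.example/repo/"  # hypothetical internal repo
WORKDIR = Path("build/abandoned-lib")


def run(*cmd: str, cwd: Path | None = None) -> None:
    """Run a command and fail loudly if it doesn't succeed."""
    subprocess.run(cmd, cwd=cwd, check=True)


def main() -> None:
    # 1. find the source online
    if not WORKDIR.exists():
        run("git", "clone", "--depth", "1", UPSTREAM, str(WORKDIR))

    # 2. get the build going / 3. fix the security problem
    run("git", "apply", str(PATCH.resolve()), cwd=WORKDIR)
    run("make", "dist", cwd=WORKDIR)  # or npm pack, dotnet pack, mvn package, ...

    # 4. deploy to the local artifact repo (in practice this is tool-specific:
    #    npm publish --registry, nuget push, twine upload, mvn deploy, ...)
    artifact = next(WORKDIR.glob("dist/*"))
    run("curl", "--fail", "-T", str(artifact), ARTIFACT_REPO)

    # 5. keep on trucking: pin your application to the patched build


if __name__ == "__main__":
    main()
```

The point of the sketch is that the script just shells out to the tools you already have (git, your build tool, your artifact repo's upload mechanism) rather than reinventing any of them; the manual version of the same loop works fine too, it's just easier to forget a step.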