The examples of how to properly integrate Lua into a larger commercial engine have been reused by my team for over a decade, and picked up both by larger teams shipping on Steam and by hobbyists.
We wrote a game engine called Planimeter Game Engine 2D, also over a decade old, with tons of easily accessible reference code for game engine engineers and game developers. It covers advanced concepts like client-side prediction tolerant of latencies all the way down to dial-up connections, and a UI compositor with CSS 2.1 features.
My software firm, Andrew McWatters and Co., has an internal codebase called myapp which is also many years old and serves as a base for delivering web services to clients. It leans on experience from dozens of businesses, including some in finance and healthcare, to deliver solutions compliant with federal standards.
Good software lasts longer than bad software because it's useful and doesn't chase trends: it defines them, or stands apart in ways no one else's software does.
Somewhere around 2010, they contacted me because they had a problem with it after replacing the computer with a newer system. It turned out they had the parallel input cards swapped; once that was fixed, it was put back in service. As far as I know it's still in service.
It's quite possible that thing will outlive me. Why? Because it's tied to hardware that works and does economically important work.
I wrote it in a weekend, just as a one-off while I was learning Node and procrastinating my other studies.
Today, it gets just under 30k weekly downloads, which isn't a ton, but it shows it's still being used.
After I got a real job, I found it in an internal third-party repo. Apparently some contractors built a site with it years ago.
I ran into my old boss, who was a family friend, and he told the story that they used that as an example to improve other stuff, and that it had saved a few million bucks over the years (iirc they paid for CPU cycles). It was still running as of 2020!
The bench technique was not as repeatable as I needed to begin with, but the basic calculations were identical to how they had been done with slide rules; electronic calculators were just the modern replacement.
The fourth decimal place really makes a big difference when you've got thousands of tonnes.
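To put hypothetical numbers on that (the cargo size and assay values below are invented for illustration, not from the original work): a shift of one unit in the fourth decimal place of a mass fraction, applied across a few thousand tonnes, moves the settled quantity by a meaningful fraction of a tonne.

```python
# Illustrative only: how a fourth-decimal-place change in an assayed
# mass fraction scales up over a large cargo.
cargo_tonnes = 5000.0

assay_a = 0.9871  # mass fraction reported by one lab
assay_b = 0.9872  # same cargo, one unit higher in the fourth decimal

difference_tonnes = (assay_b - assay_a) * cargo_tonnes
print(round(difference_tonnes, 3))  # half a tonne riding on that last digit
```

With bigger cargoes or more valuable product, that last digit is exactly the kind of thing disputes get settled over.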
There was one caustic plant where they had a very specialized lab apparatus that allowed better precision, but we were not going to be able to adopt that approach worldwide in our independent labs.
I figured out a way to easily reproduce the results using ordinary apparatus, but the calculations were far too hairy for most bench operators to adapt to. And that was with calculators; nobody would have ever done this with slide rules.
So I wrote a little BASIC program on the Commodore VIC-20 that we had in the office just to experiment with. With 20/20 hindsight, I was the first and only programmer at this particular multinational at the time.
Then it was a breeze to just assay the easy way on the bench, type the readings into the VIC-20, and out came the result.
I ended up leaving the company to build a lab in another port before this could be deployed elsewhere. I did port the code to a few different platform BASICs before the IBM PC became so common, and I've used it myself ever since whenever I need to break out the big guns to settle a dispute or something.
One of the reasons I think it lasted so long for me is because it solved a problem that existed many decades before I came along with a systems approach that included code for the first time in history.
Also, if you drive a 2007 Mazda RX-8 (internal names J64/J61J/J61A) with the immobilizer feature (back then it was opt-in; nowadays every car has an immobilizer by law), then the code running that feature was partially written by me between 2005 and 2006.
The reason for the longevity is the focus on extreme performance: leaning hard on I/O, the database, and the parallelism offered by the platform. Our language choices were also pragmatic and are still supported. So, even with all the DB and platform version upgrades, the code still runs.
I think they are finally going to put it to rest, after a long discussion and brainstorming session about AI and its impact on their main product (now no longer a CMS, but a low-code platform for SMEs).
It is a scheduling system for a department at a major public hospital in the area.
It’s still one of the projects I’m happiest with. Despite subsequent developers making a bit of a mess of it, it is a nice piece of software. And, compared to the rest of the stuff in the health system, it is a shining beacon of hope for generations to come.
As I’ve progressed and started working for some bigger companies, I’m incredibly impressed by what we all, myself included, did with so little.
My oldest code that I myself still use in production is probably some Python web services first prototyped around 2009 and then refactored several times including the port from Python 2.x to 3.x, which we deferred until 2018-2019. It is still in use because we keep finding use for it in funded projects. The Python language and hosting environments have been stable enough to keep it running with minimal externally-forced maintenance.
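As a hypothetical illustration of the kind of change such a 2.x-to-3.x port forces (this is not the author's actual code): Python 3 strictly separates bytes from str, so web services that mixed the two freely under Python 2 have to encode and decode explicitly at their boundaries.

```python
import json

def render_response(payload: dict) -> bytes:
    """Serialize a JSON body for a WSGI-style layer, which expects bytes."""
    text = json.dumps(payload)     # a str under Python 3
    return text.encode("utf-8")    # under Python 2 this distinction blurred

body = render_response({"ok": True})
print(isinstance(body, bytes))  # True
```

Tightening up boundaries like this was one of the few externally-forced maintenance chores, and once done, the language has been stable enough to leave the code alone.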
I might have had some older shell scripts that were in personal use for longer periods, but I've found myself retiring them one after another rather than trying to maintain them anymore. This included lots of my own "paperless office" kinds of tasks to acquire, convert, display, or otherwise manipulate various document and image files. I also used my own custom backup tools and email replication strategies for a very long time before begrudgingly moving to contemporary tools.
With other open source contributions, it's harder to say how long something lasted. Some of my C code ended up in grid computing libraries that might have also gone for almost 20 years, but I suspect it fell into disuse at least 5 years ago.
I also helped arrange some work on SSLeay, which later forked into OpenSSL, but did not author code changes myself.
Through my job, I was a bleeding edge user of a lot of Linux features and tried to provide good bug reporting, but did not contribute as an author. This included early SMP support, large file support, kernel-level GPU support, and various wireless and wired networking drivers.
My earliest open source engagement was as a beta tester for XFree86 2D drivers around 1994 while in college. I did a bit more of this with early 3D driver testing at my first job. I may have contributed some register dumps and other hardware info, but still would not say I authored any of it. Also as drivers are hardware specific, I don't know how long it took for my efforts to fully dissipate.
It has literally one user, who finds it perfect and doesn't want it changed in any way, and I actually think it's mission critical at this point.
I think I have a phone call to make next week...
It's a "single page app". Using jQuery. Python backend.
I truly hope nobody else ever has to work on it (now there are real single-page libraries), but it still works, has been going like a charm for over a decade, and the users like it.
Every year I think maybe I'll redo it using Vue or something but never do.
I helped rewrite FIS (RSTS) as XENTIS (VMS) for Park Software in the early 1980s and worked on it for over a decade. It lasted as long as VMS did. This was the prototypical "report writer" / wizard.
I wrote a logistics system for a nonprofit on donated VAX hardware. That lasted a decade before Y2K doomed their accounting system (VAX don't care about no steenkin' Y2K!) and both were replaced together.
I wrote a procmail guide (documentation) and a web page with a JavaScript-based .procmailrc wizard that rattled around the internet for a while.
I've got various pieces of automation which have been ported to different systems for as long as two decades (I don't think I have anything currently running which is older than that).
I invite you to view the repos I have pinned at GitHub: https://github.com/m3047/ although none is older than 2019.
I seem to be building a command-line based federated SIEM leveraging DNS and Redis. At this point I don't know if it will ever see the light of day in terms of wider availability; I haven't found anyone else who's on board with this level of sedition (yet).
I've built other software which was popular for a short while.
I'll offer an honorable mention for http://twiki.org/ as I used to curate a distribution. I'm still personally running the version based on the Dakar release from 2006. I've patched a couple of security issues and made some small changes due to changes in Perl in that time.
So what makes code last?
It's not a beauty contest.

- It solves a problem, but you can solve the problem without it.
- It doesn't sit astride a shear layer.
- It inspires confidence in terms of predictability, and it offers useful internal telemetry or other mechanisms which allow that confidence to be tested, explored, or extended.
- Oftentimes it doesn't solve THE problem, but is composable to solve various problems.
- It doesn't chase trends (XENTIS pretty much filled out every useful feature we could think of in its niche, but it stayed in that niche).
- It might be personal bias, but configuration is declarative, or at least machine parseable and accessible, so you can compare configs over the last decade or more.
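A minimal sketch of that last point (the settings and values here are invented for illustration): when configuration is machine parseable, snapshots taken years apart can be compared structurally instead of by eyeballing free-form text.

```python
import json

# Two hypothetical config snapshots of the same service, years apart.
old = json.loads('{"workers": 4, "log_level": "info"}')
new = json.loads('{"workers": 8, "log_level": "info", "tls": true}')

# Structural diff: every key whose value drifted between the snapshots,
# including keys that only exist on one side.
drift = {key: (old.get(key), new.get(key))
         for key in old.keys() | new.keys()
         if old.get(key) != new.get(key)}
print(drift)
```

The same comparison over a decade of snapshots tells you exactly when and how a deployment drifted, which is much harder to reconstruct from hand-edited, free-form config files.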
Next up would be random ETL scheduled processes in back-offices.
I helped familiarize a new employee with said code.
I wrote that code before they were born.
Hard to beat PHP and C++ when it comes to that. Write once, run anytime, for decades.