HACKER Q&A
📣 keepamovin

Is it getting harder to write software over time?


As a "fun weekend project" I decided to try to build DOOM on macOS (M1, 2020). After installing xQuartz and removing the -m32 flag from the Makefile, I built it under Rosetta, only to segfault on startup; the issue is in r_data.c line 541 (per -fsanitize). Yet the whole compile happened in around 5 seconds. I thought about digging in more but couldn't be bothered, so I went looking for a port. I found Chocolate Doom and installed chockpkg. I started the build 5 minutes ago, and it's still configuring the Makefile ("Checking for...").
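For reference, the steps described above can be sketched roughly like this. This is a hypothetical reconstruction, not the exact commands used: it assumes the classic linuxdoom-1.10 source tree, and it rewrites the Makefile portably rather than using `sed -i` (whose syntax differs between BSD sed on macOS and GNU sed):

```shell
# Hedged sketch of the build steps above (paths and flags are assumptions,
# based on the classic linuxdoom-1.10 source release).

# Strip the 32-bit flag from the Makefile without in-place editing,
# to avoid BSD vs. GNU `sed -i` incompatibilities:
sed 's/-m32//g' Makefile > Makefile.tmp && mv Makefile.tmp Makefile

# Then build as x86_64 under Rosetta 2, with AddressSanitizer enabled
# to surface the startup crash (commented out; macOS-only):
# arch -x86_64 make CC=clang \
#     CFLAGS="-g -O2 -fsanitize=address" LDFLAGS="-fsanitize=address"
```

The `arch -x86_64` prefix forces an Intel build on Apple Silicon, which matters here because the original code base predates arm64.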

It struck me that maybe it used to be easier to build software. But the reverse should be true: it should be easier for me to build a DOOM source port now, on my M1 Mac, than it was many years ago. Yet it seems it's not. And if it's harder to build, then it's harder to write. The original DOOM source is tight, but Chocolate Doom (no disparagement) was a multi-megabyte repo pull.

Is this a general thing, or am I overfitting to my one experience of trying to build ancient software on modern hardware? :)


  👤 T-A Accepted Answer ✓
> if it's harder to build, then it's harder to write

I think the opposite tends to be true. You're dealing with "a multi-megabyte repo pull" because of SDL2 and the other dependencies needed to build Chocolate Doom.

Writing on top of SDL2 etc. is obviously easier than having to implement everything from scratch. It makes building more cumbersome and usually yields a more bloated, less efficient end result than having John Carmack hand-optimize everything, but it lets you get the thing working well enough on modern hardware.

Back in the days of the original Doom, when every clock cycle and every byte counted and ready-to-use libraries were few or nonexistent, writing was much harder.


👤 superchroma
I've used Visual Studio 2008 through to now. Things are better now. Web tech is a tire fire, but it always has been a tire fire.