Software is more complex than ever before by a wide margin, but for every piece of Electron bloatware there's plenty of sleek Swift-driven amazement. And speaking of Electron, as maligned as it is, those apps still ship Unicode support, networking, and self-updating as table stakes.
There are some amazing old programs out there (have you seen the engineering behind coreutils?), but things are most definitely not worse off today.
I see two main reasons. First, too much reliance on automated testing, which may catch crashes but does not guarantee good UX per se.
Second, the quality of recent talent in the industry. Too many people are in it for money, and could not care less about craftsmanship.
The Macs I got to use in school felt inhumanly slow. I never got to use one of the fun colored ones, but I heard those were a big step up.
Windows XP was a big improvement. Downloading and installing SP2 and SP3 was daunting on a 56k connection. Still, the OS was plagued by malware. I remember running various malware scans constantly.
Mac OS X on the first Intel-based Macs was a massive step up from Windows XP. It converted me from a die-hard Windows fan to a Mac user. I still prefer Macs.
For a while I used Windows 7 and it was OK. Better than XP and everything that came after 7, but still not that great in my opinion. One thing that drove me crazy was fixing fuzzy text rendering issues in Windows 7.
Not long ago I re-installed Mac OS X 10.4 (Tiger) on my 2007 MacBook. I was surprised by how snappy it was compared to my 2018 MacBook Air, and even my M1 Pro. To this day, the computer I miss the most is my 2013 MBP.
I would argue that Mac OS kept getting better until 2013-2017-ish. Windows peaked at Windows 7. Linux desktops struggle to compete with Windows. The operating system plays a huge role in user experience. Declining UX quality in operating systems makes the good stuff not as good and makes the bad stuff really stand out.
The incentives are completely upside down. The chief "innovation" of the past decade has been to shift software from being something that at least theoretically worked in the interests of its users to something that works in the interest of a burgeoning digital surveillance industry.
The field has always been ruled by a kind of fashion. But with Metcalfe's law plus the new mainstream popularity of the industry, the definition of "success" has become whatever has the most advertising money faking social proof for it. In other words, what is expected to be the most lucrative to exploit. Metrics now revolve around things like how much of users' time has been wasted, or whether they were tricked into buying something.
And to continue the landscape analogy: with remote attestation, software is now poised to be pushed into a canyon. The result is widespread computational disenfranchisement, where corporations can wield software against individuals, but individuals no longer have even an escape hatch of using software to cut through that generated complexity.
The beauty of it is that a lot of Go devs at some point decide to build pure Go libraries for performance gains (especially in the networking area I work in), and I love it.
There are a lot of very well-maintained libraries out there, and the golang.org/x packages are just the start.
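Not from the parent comment, just a minimal sketch of the kind of thing I mean: the stock net/http client plus golang.org/x/sync/errgroup (one of the golang.org/x packages) fetching a couple of placeholder URLs concurrently, in pure Go with no cgo anywhere.

    package main

    import (
        "fmt"
        "net/http"

        "golang.org/x/sync/errgroup"
    )

    func main() {
        // Placeholder URLs, purely for illustration.
        urls := []string{
            "https://go.dev",
            "https://pkg.go.dev",
        }

        var g errgroup.Group
        for _, url := range urls {
            url := url // capture the loop variable (needed before Go 1.22)
            g.Go(func() error {
                resp, err := http.Get(url)
                if err != nil {
                    return err
                }
                defer resp.Body.Close()
                fmt.Println(url, resp.Status)
                return nil
            })
        }
        // Wait blocks until every goroutine returns and yields the first error, if any.
        if err := g.Wait(); err != nil {
            fmt.Println("fetch failed:", err)
        }
    }

A "go get golang.org/x/sync/errgroup" (or "go mod tidy") pulls in the one dependency; everything else is the standard library.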
It could be that there's just a lot more software now. I do a lot more on my phone without human interaction.
The biggest issue is the lack of alternatives when software fails. When the computer says no, you'll have a really hard time talking to a human. At best you get a hard-to-find contact page that forces you to talk to a chatbot.
Software is also slow now. It's crazy how many delays modern UIs have. It also has more and more dark patterns that make you feel like you are being used by the software and not the other way around.
FOSS is better than ever, but it hasn't kept up with commercial software on cross-platform support. It's just too much boring, tedious work that nobody really wants to do without being paid.
There's no FOSS alternative to Google Keep with full parity, despite how simple it is. P2P has stagnated and been replaced with self-hosted stuff, which I have just about zero interest in.
But Jami, BitTorrent, and Syncthing are going strong, so P2P isn't quite dead yet.
There's not much of a quality difference between, say, The Day Before and King Kong: The Rise of Kong versus Big Rigs and Superman 64. They're all poorly made trainwrecks that fail miserably at doing what they set out to do.
And the same goes for most software. CMSes and things for websites? There are about as many great and terrible examples now as there were during the 80s. Same with desktop software, mobile apps, web browsers, etc.
The thought that things are getting worse likely comes from the nostalgia filter, and from how terrible products/art/whatever tend to get forgotten by history. Over time, the only works that stand out are those that are either amazing (and seen as classics to this day) or complete disasters that gain a certain amount of infamy. So you'll see the standouts compared to the flood of mediocrity around at the moment, and think the olden days were better. There was probably about as much mediocre crap released then as there is now; it just kinda got forgotten, like most mediocre things are.
Small apps are 300-500 MB where they would have been 3 MB .exes in the past. This bothers me.
Subscriptions bother me, as do adversarial business practices (the entire 3D drafting landscape is fucked, for example... try to find one company that charges decent prices for hobbyists, doesn't involve the cloud, and doesn't make everything difficult).
However, software is more complex than ever, because what we expect to do with it only gets more and more complex.
The scope of software has exploded.
Efficiency has certainly got a lot worse because it matters less on better hardware - but the piling up of multiple inefficiencies can still make things sluggish.
With the other point, too, I think it may be that we have failed to make things better rather than actually made things worse. Software quality has stayed the same, but we feel it more because we are so much more reliant on software now.
It is easy to find examples of crap code out there, as they tend to stand out. But we take for granted all of the good stuff.
Developers these days are far, far better resourced in almost every way imaginable, and the ever-climbing salaries of the industry have pushed them to make good use of those resources.
Much better to ask if the quality of your software has gone downhill. If you ask and honestly answer that regularly, you will end up better for it, I promise.
I think it's a bit the same with music. Music didn't use to be better, but we remember the best music and forget about the forgettable music.
At the outset of that, the predictions were simply that software wouldn't get built because it was getting too complicated and expensive. The 80's saw a lot of developments around programs using module systems, dynamic linking, etc.
But the software being shipped at that time was still mostly either "appliance" or "back-office" software: data came in through the keyboard and left by the printer. Most memory discipline revolved around static allocation and overlays, and the microcomputers were not doing a lot of multitasking (although it did come in with the 16-bit machines). There were more interesting things happening in big firms and universities, but the hardware there was also specified around the type of work being done, and there was admin staff physically monitoring those shared machines. There was a lot of software that, in Unix "do one thing well" fashion, could do very little, but also did it reliably. And there was a lot of eye-wateringly expensive commercial software of this type: thousands of dollars for a compiler or database.
In the 90's, software got uniformly bad for a while because the PC became so much more powerful in such a short time, and the approaches that worked before turned into a moving target of "plan for the machine spec of 18 months from now". Wintel machines just weren't made to do the things they became capable of, and you had little way of knowing whether your crash was the application, Windows, or your drivers. Macs were no better.
There were really three things that crept to the forefront in this period:
1. The hardware being commoditized, but not open. To this day, Nvidia wants to control their drivers. And as long as we keep buying from them, they can dictate part of the software stack. They are hardly alone - every big player knows the game.
2. Essential complexity being addressed with accidentally complex protocols. For example, everyone uses UTF-8 now. It addresses a significant issue in a reasonably good way. But you still have a lot of systems in the wild using UTF-16 (see the small sketch after this list). USB is designed to do everything, which gives it a high "floor price" for a system implementor compared with the classic parallel/serial mechanisms. The web browser ended up with JavaScript. And so on.
3. A drift towards financialization within software. The "dot com" hype of the late 90's was what it was because the VCs had found a formula for getting companies to IPO with no revenue. When shrink wrap software became a business, it had lots of competitors, but by the 90's consolidation meant there was one "industry standard" per industry (Microsoft, Adobe, Autodesk, Oracle, etc.), and it became more interesting to target consumers with Internet appliances. This project of building the ultimate consumer platform describes most of the past 30 years or so, in different phases and across different facets.
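To make the encoding point in (2) concrete, here is a tiny illustration of my own (Go, since it came up upthread): the same seven-code-point string measured as UTF-8 bytes and as UTF-16 code units. Every boundary where those two views of "string length" meet is a place for off-by-N bugs and mojibake to creep in.

    package main

    import (
        "fmt"
        "unicode/utf16"
    )

    func main() {
        s := "héllo 🌍" // 7 code points

        // Go strings are UTF-8, so len() counts UTF-8 bytes: 11 here.
        fmt.Println("UTF-8 bytes: ", len(s))

        // The same code points encoded as UTF-16: the emoji needs a surrogate
        // pair, so we get 8 code units, i.e. 16 bytes.
        units := utf16.Encode([]rune(s))
        fmt.Println("UTF-16 units:", len(units), "->", 2*len(units), "bytes")
    }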
If it weren't for open source it would not be possible to host so much complexity. Kicking things down a layer to a dependency has been the way in which the software crisis has been handled, but a lot of the hasty or accidental standards are still standard.
Yes.
> If so, why?
CADT (the "cascade of attention-deficit teenagers" model). Fixing bugs is much harder than a complete redesign. Addressing existing problems is much harder than implementing new ideas.
Also, I think that bug-free software has never existed... hopefully one day it will, though I'm not too hopeful it will happen in my lifetime.