I can't help wondering whether we have screwed ourselves pretty badly, with no escape from it. A vocal minority pushes these overly complex solutions down everyone's throats, and management loves it, because it creates "work" for its own sake without adding any real business value.
What are your thoughts on this? Will industry move towards simple solutions after experiencing this churn down the line or are we doomed forever?
Of course, there are plenty of projects where judgement is thrown out the window in favor of adding a buzzword to everyone's resume. I've heard it called "promotion-based architecture": you pick the technology most likely to get you promoted. (If that works, it says all sorts of not-great things about your organization.)
Regardless, I don't think the availability of tools is the root problem. It's an industry-wide lack of emphasis on identifying and understanding the problem first.
With software, the problem is even greater. You can use React, for example, but you will probably start with create-react-app, which adds a lot of things. Or you could start with Next.js, which adds a lot of things. You could use Java, but you will probably start with Spring Boot or Dropwizard, which add a LOT of things. Plus all of these starting points imply new languages, configurations, and programs, in particular new build systems.
In my view, all of these "Bisquicks" represent experiments-in-progress, with the ultimate goal of the systematic characterization of "software application", in general. In other words, they are systems of alchemy, pushing toward being a system of chemistry, which we don't have yet. So it is bound to be a confusing time, just as it was surely confusing to play with chemicals before Lavoisier systematized the practice.
If you're using Python for the web, you're already part of the complexity problem, at least from the perspective of someone deploying PHP 15 years ago. I use Python for web development, and I love it, but deploying webpages used to mean copying an Apache config and ftp/scp-ing your files to some location. Now we need app servers and reverse proxies, static files are served differently, and even though I've gotten used to it over the last decade, that doesn't mean it's good.
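For comparison, a minimal sketch of what that modern setup tends to look like (the domain, paths, and port are all made up): a reverse proxy in front of a separate app server, with static files getting their own rule.

```nginx
server {
    listen 80;
    server_name example.com;

    # Static files no longer sit next to the code; they get a dedicated rule:
    location /static/ {
        alias /srv/app/static/;
    }

    # Everything else is proxied to a separate app server (e.g. gunicorn):
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```

Two moving parts (proxy plus app server) where there used to be one Apache config, before you even get to process supervision.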
The other thing is that monorepos are pushing back against complexity for the sake of complexity. Why create a new repo when a directory will work just fine? I think a ton of people got repo-crazy because their corporate Jenkins server only allowed one project per repo, but it is trivial to check the latest git hash for a directory and base your deployment on that. ...I have a project I inherited that has 14 repos for a webpage with maybe 5 forms on it. I've mostly handed it off at this point, but every time I have to look at it I end up preaching about monorepos for weeks.
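The per-directory hash trick mentioned above fits in a couple of lines of shell (the `services/` layout is hypothetical):

```shell
# For each service directory in a monorepo, find the newest commit that
# touched it; redeploy a service only when that hash differs from the one
# recorded at its last deploy.
for dir in services/*/; do
  hash=$(git log -1 --format=%H -- "$dir" 2>/dev/null) || continue
  echo "$dir -> $hash"
done
```

Untouched directories keep the same hash across commits, so CI can skip them without any one-repo-per-project ceremony.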
I think you're just in a very negative space if you start from "distributed systems" as something overly complicated. At some scale, getting a bigger machine either doesn't make financial sense or is simply not possible to implement efficiently. Some ideas are taken too far or applied where they're not needed, but I'd recommend learning where each of them started and why. Criticize for valid reasons, but don't become a curmudgeon.
But has being a software engineer become easier or harder over the last 30, 20, 10, 5 years? I haven't been an engineer for that long, but my impression is that programming today is a lot easier. Dev tools, compilers, and linters are very good. There's also a lot more community documentation on Stack Overflow. Some of the complexity is hidden from the developer, which is good and bad: it can bite you in the ass later, but in 95% of cases it's a good trade-off in my experience. For instance, my preferred stack is Serverless on AWS. I can set up a single config and have cloud storage, an API, a database, logging, and auth, all permissioned and in a file I can check in. And with a generous free tier, it's pretty much free. I'll admit that if something goes wrong it's not fun to debug, but it's remarkably fast and simple for me to spin up a CRUD API.
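As a rough illustration of that "single config" claim, here's roughly what it can look like with the Serverless Framework; every name here (service, table, handler) is invented for the sketch:

```yaml
# serverless.yml -- illustrative sketch, all names are made up
service: notes-api

provider:
  name: aws
  runtime: nodejs18.x
  iam:
    role:
      statements:
        - Effect: Allow
          Action: [dynamodb:GetItem, dynamodb:PutItem]
          Resource: !GetAtt NotesTable.Arn

functions:
  createNote:
    handler: handler.create
    events:
      - httpApi: 'POST /notes'

resources:
  Resources:
    NotesTable:
      Type: AWS::DynamoDB::Table
      Properties:
        BillingMode: PAY_PER_REQUEST
        AttributeDefinitions:
          - {AttributeName: id, AttributeType: S}
        KeySchema:
          - {AttributeName: id, KeyType: HASH}
```

One checked-in file declares the API route, the function, the database table, and the permissions tying them together.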
Here's a walk down memory lane for you about rewriting apps, circa year 2000: https://www.joelonsoftware.com/2000/04/06/things-you-should-... If you replace names and versions of things with "Go", "Rust", etc. it is pretty much what you describe.
The dangerous spot is engineers with 5-10 years of experience who have become good enough to write huge piles of unnecessary code and make them work.
A refreshing experience was building a mobile app for an Apple device with Swift and SwiftUI. It was a real joy: it works as expected, produces concise code and small files, and offers live preview and reasonably fast build times. Sure, it's a closed environment, but the last time I felt so productive doing UI dates back to Visual Basic.
Counter-example: a simple web app, nothing fancy, with node_modules filled with around 500MB of files and hundreds of declarations of injected things everywhere.
But nobody forces us to use Kubernetes, nobody forces us to climb the Rust learning curve, nobody forces us to use this multi-platform framework that solves all the problems of the universe.
I try to stick to standard solutions, often the ones proposed by the vendor: Kotlin on Android, Swift on Apple, C# on Windows. For server code, stick to Java, or try the simpler Golang (another refreshing language).
Also, I try to adopt tech late: I'm just starting to feel confident in Docker, and I'll see in a few years whether Kubernetes could be useful.
But an architect needs complex solutions to justify their job, a team lead needs as many devs as possible to brag about at the next family dinner, and the new dev wants to try the fancy new tech to put on his resume. So they are all fine with it. Just don't tell the company ownership.
1. This industry absolutely has, and has had for a long time, a problem with "oooooh, shiny!" chasing. We collectively obsess over using the latest and greatest, newest and shiniest, "sexy" technologies of the day. And sometimes (often?) this obsession overrides good judgment and we try to use this stuff regardless of whether or not it's actually a better fit than something older and more prosaic.
2. However, sometimes the "new, shiny" is actually better, for at least certain applications. And we should always be willing to use a newer, better, "sexier" tool IF it actually helps to solve a real problem in a better way.
Unfortunately (1) often seems to trump (2) and we get stuck using the "newest and shiniest" for no particularly good reason, other than the simple fact that it is the "new shiny".
I have no expectation that this trend will ever abate.
Even if you don't believe it's a conspiracy, you have to admit that this dynamic favors incumbents, that the big frameworks often come from the big incumbents, and that if you are a big incumbent who saw the possibility of this dynamic, even without manufacturing it, why not take advantage of it?
The outlier is the scrappy indie developer like Pieter Levels, who runs his multi-million-dollar-a-year business basically on his own, on a single machine, using PHP, and only recently started using git. That may be an extreme example, but it paints a picture of what radical effectiveness and efficiency look like, and it's vastly different from the Kool-Aid. But don't mention it, or the mob will come for you.
May the no-code & indie scene save us all. Amen
You should know that in the past when OOP was not common, we had to work a little harder doing things like managing our own memory or building a LAMP server to publish our web pages.
There was a thriving market for language and UI add ons. The result was that each company had their own internal dev tools and recruiting people outside of the company who had experience with those tools was nearly impossible.
All that said, we were at a point where entry to programming was easy (think Visual Basic in the 90s). The quality generally went down as everyone was pushing their "first project" as if it was a polished product. Finding actual good programs on PC is close to the situation on mobile where most of the apps are trash.
But this is where the senior developers and architects should come in; they need to make a firm stand. Make things like https://boringtechnology.club/ mandatory reading. Have people that want decision power write ADRs so they are made to explain the context, the problem, the possible solutions, the tradeoffs, and the cost of picking technology X to solve problem Y.
It's too easy to bung in new technology or change things around; the short term, initial cost is low and the long term cost is not visible, because it quickly becomes "this is how things are done around here". Make it more expensive to change things.
And make people responsible for these things. If an individual advocates for technology X, they have to train and hire for technology X as well, and be responsible for it long-term. Learn to recognize a "magpie developer", the type that will always go for the latest shiny; you can use those to introduce new technology, maybe, but keep them away from your core business because they will make a change that impacts your product and your hiring, without taking long term responsibility for it.
anyway, random thoughts.
There’s no we. There’s a million moths slapping into everything and being drawn to various bright, often false, lights.
What you have to do as a developer is try to keep up with the hundred new things, possibly dabble in them to see what they are, and decide for yourself how much effort you want to put into that particular "thing". You have to use judgement or you will burn yourself out.
I never bothered with Pascal. I learned enough Java to be dangerous, but it didn't really apply to my problem/solution domain. I did learn C++ and also learned to distinguish between when a solution looked like an object (C++) and when it did not (ANSI C). If anyone tells you to 'always use C++' ignore them.
I learned Perl, because I found it more useful than AWK for large problems, but AWK still reigns supreme for 'one liners'. Then I learned Python, and discovered that the problems then fall into Python or AWK. I rarely use Perl anymore.
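The kind of job where AWK still reigns for me, as a toy example: summing a column of whitespace-separated records in one pass, with no imports and no boilerplate.

```shell
# Sum the third column of some whitespace-separated records:
printf '1 a 10\n2 b 32\n' | awk '{ s += $3 } END { print s }'   # prints 42
```

Anything much bigger than this (real parsing, data structures, reuse) is where I'd reach for Python instead.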
I tried my hand at Go. I don't find it very satisfying. I am looking at Rust.
Everything else I have ignored, by virtue of choosing what problem domains I am interested in solving.
So no, software engineers are not screwed, not by any margin. Just choose the problem/solution domain you want to work in, narrow down the tools you want to be competent in, and move forward. Try to avoid the 'Look! A Squirrel!' response mode as much as possible, but do poke your head up to see what the world is doing on occasion. Be aware, though, that a lot of it is useless noise.
Intelligent animals need stimulation or they get bored and depressed.
I think collectively, "let's move to Rust" is at least partially because we're not challenged enough by writing the same CRUD app for the 20th time in the same language we've been using for the last 5-10 years, and we want to leave our mark in a new ecosystem by implementing whatever is missing.
Some people want to optimise for "fun/exciting/different" while others seem to be aiming for "known/just works, incidentally boring".
We probably need to find the right middle ground: how do we keep it fun and challenging while keeping it simple and maintainable?
* All these new tools, they give us options, no? Use the right tool for the job, the ability to switch if something becomes old/unmaintained.
* Is this actual complexity or perceived complexity given your experience? The node ecosystem looked very complex for me (someone coming from Python) until I actually got into it. Now it seems pretty run-of-the-mill.
* Is k8s really all that hard? Build a container and you don't have to worry about provisioning it and deploying it again.
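To make that "build a container and deploy it" point concrete, here is a bare-bones sketch of a Kubernetes Deployment (image name and labels are placeholders); once the container exists, this is roughly all the provisioning description you write:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 2
  selector:
    matchLabels: {app: web}
  template:
    metadata:
      labels: {app: web}
    spec:
      containers:
        - name: web
          image: registry.example.com/web:1.0.0
          ports:
            - containerPort: 8080
```

`kubectl apply -f` on that file and the cluster keeps two replicas running, restarting them if they die; that is the "don't worry about provisioning again" part.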
There may be good reasons to use some of the technologies you pointed out. And that's a strong "may", because I can easily come up with arguments in the other direction in addition to yours. I say all this to mean you just shouldn't dismiss something because it seems hard. It may be, and it may not be, and even if it is, it may still be worth your time if the payoff is great enough. You have to do the legwork to figure that out.
Unnecessarily complicated is the default. Choose the elegant thing wherever possible (it's not always possible, but it often is).
Actively avoid complexity, or it will shackle you.
You sure about that? Sometimes the seemingly simple problems are quite complicated. Partly because we are building software for a world that is fraught with (security) landmines.
But point taken, sometimes you can overcomplicate the architecture.
>Distributed systems? Kubernetes? Rust for CRUD apps? Blockchain, NoSql, crypto, micro-frontends and the list goes on and on.
Each of those is a particular tool for particular problems (though I'm not sure why Rust for CRUD apps is so terrible).
>moving away from python (because its too "slow");
Not only is it slow, but the lack of compiler support for typing leads to an inordinate number of (stupid) runtime problems. I say this because I recently inherited an entire inventory of Python software built up over the years at my current employer. Right now, I have a bug backlog full of runtime blow-ups (dating back years) caused by careless typing. Coming from the unsexy world of C# and Java, I'm still trying to see why Python would ever be used for anything but scripting and (maybe) prototyping: it's slow as molasses, with no compiler support.
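A toy example of that failure mode (the function and values are invented for illustration): a string leaks in where a number was expected, and Python either fails only at runtime or, worse, silently does the wrong thing, while a static checker like mypy, or a C#/Java compiler, would have flagged the call before it ever ran.

```python
def total_price(quantity, unit_price):
    # Intended contract: quantity is an int, unit_price is a number.
    return quantity * unit_price

# Somewhere far away, a value arrives as a string (say, parsed from a CSV):
qty = "3"

# str * int silently "works" via sequence repetition -- no error, wrong answer:
print(total_price(qty, 4))        # prints 3333, not 12

# str * float at least fails, but only on the day this line executes:
try:
    total_price(qty, 9.99)
except TypeError as exc:
    print("runtime blow-up:", exc)
```

With type annotations plus mypy, both calls would be rejected at check time; without them, the first bug can sit in production for years.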
I work at a foreign language instruction firm. I'm making a virtual reality training environment for them. It's the best job I've ever had. I don't have anyone micromanaging my work, because nobody understands my work. I barely understand their work, and that's ok. We understand that about each other and we actually collaborate.
In the last 3 years I've not once been yelled at, talked down to, berated, cajoled, pressured into working overtime, any of it. I've not seen it happen to anyone else, either. I have an office of my own. I can work from home whenever I want. People just trust me to be an adult and do my work and it's the greatest thing ever: basic human decency.
It’s a peculiar feature of human nature that we want to make things more complicated than they need to be. The more something relies upon a combination of our skills, and the more esoteric those skills, the more insulated that thing is from outside influence, ownership, and control.
My bet is that the frustration you feel is less about complexity and more about your inability to effect change. You're just one of many competing solutions to the same set of problems, and people will think your ideas are just as complicated, because they're not their ideas. They understand their own ideas better than they understand yours, and vice versa.
And we all live under this umbrella, together. I think that’s why the biggest asset you have as an engineer is to influence people who make decisions. Unfortunately, the best way to influence them is to convince them you have important, complicated knowledge they don’t. Self reinforcing loop.
It won't change until we can form a guild (professional association) and turn it into a bona fide profession. Right now, code that one developer creates may be unrecognizable to another developer, even though both are working in the same domain. It would be a disaster if one lawyer could not follow a brief written by another, or a doctor could not decipher which techniques another used to perform a particular surgical procedure.
"Just because you can drive a car with your feet if you wanted to, doesn't make it a good fucking idea!" --Chris Rock.
A union doesn't have to be a huge monstrosity. It can be simple and fight for a few basic standards in the industry.
“Doesn’t add business value”
But do you know how the business makes money (the actual processes)? Can anyone tell you how to add value in concrete terms?
Because in over a decade of consulting on technical leadership, Agile, lean and DevOps, the most consistent issue I’ve seen is that those questions are unanswerable for almost anyone in almost any company.
In the absence of a clear path to value creation, everyone optimizes locally for “best practices” because…
the root problem is that almost all decisions have to be explained to people who know next to nothing about your area & you still need to sound rational.
The local maximum for that usually is “this is how _____ does it & it’s the new trend now.”
The industry trends towards the most useful solution, not the simplest one. React isn't internally simple, but it killed the daily parade of frontend JS framework experiments because it established a useful paradigm that covers a lot of web GUI use cases.
The process is messy, but it's not illogical.
I'm pretty good about shutting those types of talks down in my own org. Usually when "slow" is mentioned, you have to take the presenter's word for it; rarely do they include metrics. And if they do, once we delve into the code it usually becomes obvious why something is slow. Usually "slow" comes from using bad abstractions.
But companies that need those tools really, really need them. We don't use "NoSQL" databases (I hate that term; whether you use SQL is completely orthogonal to the problem, and "non-relational" or "non-OLTP" would be better) because we think they are cool tech; we use them because traditional, relational, OLTP databases don't work for our use cases. But if someone comes to me and asks what database they should use, I always say "Postgres", unless they can present a compelling reason Postgres won't work.
The problems we face are tremendously complex, though, and only getting more complex. We fight back with tools, but there are years where the tools are failing to keep up.
Just one example: Scala. First, I'm not criticizing the language itself; it has its place. But what I saw was programmers trying to create a protected space that would provide higher bill rates. Java was everywhere and hiring a Java developer was easy. Scala was new and had a steep enough learning curve that you could drastically shrink the candidate pool while selling the shiny new toy to management. They could create complex, arcane code that kept new developers from getting up to speed, with the excuse that those developers were inferior and not smart enough to keep up. It didn't work for very long, as management caught on that they weren't getting much other than higher labor costs. Go seems to be the latest incarnation of that, while Rust is a bridge too far to sell to management.
So it's this back and forth: developers provide something that management can sell to their superiors as new, management buys in as long as they can get promoted before it inevitably blows up, the developers who sold it move on to new projects, rinse and repeat.
The only way to counteract this natural course is to explicitly and continuously take the time to simplify and consolidate things and to bear the extra cost of that continuous effort. But the incentives are stacked against that. As long as it (barely) works, the one taking shortcuts and increasing complexity, or just adding something new, will have an edge. It’s also much easier to create yet another leaky-abstraction layer on top of an existing system than improving the underlying system, because the latter is already in use by too many parties and the necessary changes cannot be done without breaking compatibility.
Another factor is that the field is still learning (e.g. type systems, how to best handle concurrency, distributivity, etc., not to speak of changes in the hardware having an effect on what works best, e.g. cache locality, parallelism, GPUs, etc.) and to some degree is still in its infancy. Maybe at some point in the future we will have it all figured out and reach a point of stability where we can concentrate on just making everything as simple and coherent as possible in the ways we by then know work best. But maybe not, and certainly not within our lifetimes.
So, yes, for the time being I’d say there is no real escape. But you can probably find a niche where things are calmer and slower, and stay away from the areas that are the most crazy and quick-moving.
I don't think this is a bad thing at all. Every time we learn things, every iteration software improves. The pendulum swings back and forth -- mainframe to PC to server-side to browser based apps to whatever's next, and every one of those offers benefits to what came before.
I think the industry is waiting for AI to come through. They want the business analysts to be able to write their specs in English, and have the AI do the coding. In such a scenario lots of developers will lose out - some will still be needed - but from a business perspective, this will be even better than outsourcing.
"Back in the day" you really couldn't get things done if you didn't actually know how things worked. Today, you can do a little learning (which is still a dangerous thing) and based on low quality requirements create a buzzword-friendly applications with 1000 dependencies you neither know about nor have to check for. That was however just a side-effect of something that is a good thing: composability.
But that dives into the technical side of things. In reality, the marketing of technologies as products and the involvement of human middleware (management) in things they have no business being involved in cause most of the perceived problems. That is not something caused purely by software engineers, nor can it be solved by them alone.
Two ways to go about it could be:
1. Having all the human overhead go through the same requirements and QA process as everything else
2. Be better at marketing your own solution (but make sure it has the correct technical and business underpinnings)
This doesn't work in legacy hierarchical work environments, and you're essentially just screwed if you are stuck in one of those. Best to either stop worrying about the technology in one of those situations, or move on to somewhere else.
https://news.ycombinator.com/item?id=31217253#31240227
It's roughly 3x the complexity and labor compared to the 1990's desktop-oriented IDE's like VB, Delphi, PowerBuilder, Clarion, FileMakerPro, etc.
I realize deployment (installing, updating) was harder compared to web apps, but I'm not sure it has to be either-or, simplifying deployment at the expense of development. Oracle Forms seemed to do the CRUD job sufficiently without installing a new EXE for each app update; it seemed almost like a "GUI browser". A stateful GUI markup standard may help us get closer to that again.
Oracle Forms was not perfect, but we should have learned from what worked well and improved upon it. Instead we threw out the productive baby with the bathwater.
We have over-focused on social media and "web scale", but ordinary CRUD still does most of the real office work. Making our apps "mobile friendly" has crippled them, despite the fact that most real work is done with mice. It's time to return to YAGNI, KISS, and real GUI standards. It's not about nostalgia; it's about NOT accepting the waste and bloat our current dev tools have. "Hello World" has a zillion lines of code behind it now.
It is not necessarily fair to say that the majority of software engineering jobs actually require or involve the en vogue tools.
1. Just because tech stacks gain traction in headlines does not mean that they are truly mainstream, but rather that they are of significant interest to the community where the links are submitted/discussed.
2. Recruiters and job ads are written to target software engineers and are gamed towards this goal, dropping buzzwords left, right, and center, sometimes quite nonsensically. Front-end jobs quite frequently demand experience with Angular, React, and jQuery to work on something that turns out to be a Vue.js app, and so on. So this can also make certain tech stacks and frameworks appear more prevalent than they in fact are.
So, yes, there are lots of overly complicated tech stacks out there, but no I don't think anyone is screwed. Often those tech stacks will have been chosen to solve a specific business problem and then it's not overly complicated, it's appropriately complicated.
If anything, there's just more noise to filter out when selecting a place to work. Lots of buzzwords and nonsensical jargon-dropping, or indeed questionable decisions for solving a relatively simple business problem, are good indicators of places at which you probably shouldn't work.
1. The cloud allowed us to increase the available computing power at the expense of simplicity. This brought in the whole DevOps suite of problems (Kubernetes, microservices, and what not).
2. The data science hype brought in Python everywhere, which creates contention both culturally and technically.
3. The rise of mobile means you can no longer escape portability.
4. And then there's the general hype about the latest new thing. I don't think that has changed fundamentally.
I think things will get better eventually, because 1, 2, and 3 are still relatively new.
You referenced a 500-line Python script being refactored with Rust, and it made me think of the Polars project: https://github.com/pola-rs/polars
Polars uses Rust to make DataFrame operations lightning fast. But you don't need to use Rust to use Polars. Just use the Polars Python API and you have an elegant way to scale on a single machine and perform analyses way faster.
I'm working on Dask and our end goal is the same. We want to provide users with syntax they're familiar with to scale their analyses locally & to clusters in the cloud. We also want to provide flexibility so users can provide highly custom analyses. Highly custom analyses are complex by nature, so these aren't "easy codebases" by any means, but Dask Futures / Dask Delayed makes the distributed cluster multiprocessing part a lot easier.
Anyways, I've just seen the data industry moving towards better and better tools. Delta Lake abstracting away the complications of maintaining plain-vanilla Parquet lakes is another example of the amazing tooling. Now the analyses and models... those seem to be getting more complicated.
In the cloud, if you want to run a performant platform, you can typically also run it much cheaper by migrating away from maintaining actual systems.
The problem is that DevOps and system engineering jobs have become much, much more complex in order to accommodate the cloud, and as a side effect, developers now have to meet them halfway as the line between the two blurs.
If you want to run a product that processes a million records a minute, you are likely going to want to go serverless, and that means writing atomic lambda operations. We are shortly not going to live in a world where you can just do all this on your laptop, which will be good in some ways and bad in others.
You will never have to worry about environments anymore; you just write code against the AWS, Google, or Azure SDK and it will run on an obfuscated, identical system you are never aware of... which also has its pros and cons.
You are right for most companies. Normal SaaS products need to get over themselves and realize Kubernetes might not be that useful. But this complexity exists because the larger companies were having trouble maintaining the old way of doing things at the scale the world demands. As long as millions of new users adopt the internet every year, this complexity is only going to get worse. The world of 2030 doesn't exist without Kubernetes and Rust and Lambda, imo; for better or worse, it's going to keep getting complicated.
small companies copy their technical and even hiring decisions from behemoths like Google
why? market powers!
the unfortunate reality is that these companies can't compete elsewhere, so they use hype technology that allows them to market themselves better (at the said conferences, for example)
the employees can also use this opportunity to put “managed Kubernetes cluster” on their resumes to get more job offers
solution for you would be to find a company that doesn’t focus on technology, but on the problem itself
Using Go or Rust instead of Python is not inherently more complicated, it's just a different language.
NoSQL is not complicated but it's fairly useless for most of its users (despite being so popular). At the same time, it has its uses for companies that need massive scale (think Google, not your average startup).
Kubernetes is fairly complicated but it can be the easiest option (even if it's not the most resource efficient) to do something because of the ready made tools available for it.
Don't worry anyway; we haven't screwed ourselves, we've just created tons of artificial work we can spend our employers' money on and use to inflate our CVs and possibly land some more money in the next role.
When you build your own company, be conscious of this, and just use jQuery and PHP like Pieter Levels does.
I think the problem is that we're building tools to solve specific problems, and then expanding each of those tools until they become massive and need other tools to help them. So Docker solved a problem, but then it created problems that you need Kubernetes for, and so on.
One of the reasons I'm working on Darklang is that I think the root cause of this complexity is solvable. The solution, in my opinion, is to build tools that cover multiple layers of the stack; that removes the join points where you might be tempted to customize.
For example, Firebase covers multiple layers: you might otherwise need a DB, a connection pooler, a firewall, an API server, an autoscaler for the API, a load balancer, etc. But instead, the only surface area you have is the Firebase API. There are lots of similar tools that cover multiple layers of the stack like this; Netlify, Glitch, Darklang, and Prisma are some examples.
A massive number of companies, maybe even the majority, don't do this. They use what works and upgrade when needed, not when it's cool to use the new thing. They just don't tend to pay like trendy companies, and they don't look as good on a resume.
I don't think engineers are willingly screwing themselves. Does anyone here choose to adopt something they know will screw them over? We may be forced into decisions by higher-ups or by colleagues or associates, but those people generally have some reason behind their actions, they don't willfully screw engineers for fun.
The field as a whole, none of us individually can control where it goes. If your org sticks with proven older tech, it will do zero to prevent new frameworks from cropping up everywhere else. If you adopt any newer technology, you're now becoming a user, increasing its relevance, helping to test it and prove it, finding bugs and errors.
So no, "we" have not "screwed ourselves". It's simply human nature to complexify and add more tools over time.
There are a few problems.
First, the graybeards who expect everyone to know "the basics" built stuff so complex and convoluted that nobody can use it 100% correctly without their domain knowledge. That's fine, computers are complicated, but expecting everyone to keep it all in their heads is unreasonable. So people buried that stuff below a layer of abstraction, but that didn't solve the fundamental problem, and so even these higher-level tools are convoluted and cumbersome.
Then you've got the people doing this on purpose to ensure that they are unfirable. This is pretty self-explanatory, but there's a perverse incentive to overcomplicate your job so you come off as indispensable; it's like CIA and Wall Street lingo, but for devs.
Of course, it doesn't help that many people just go along with it for a paycheck.
You've got the people who want to sell their cool shiny thing as a solution to anything and everything, never mind the consequences. Everyone knows these decisions ripple through time, but they don't care.
And finally, there are the people who just don't know what they're doing: they bit off more than they can chew and are in over their heads.
All of this leads to miles of technical debt, an industry made of it, with increasingly unusable systems that require entire teams to understand and maintain.
I don't know that there is a solution. I don't know that it could happen any other way. But I do know that regardless of that, these systems cannot stand the test of time when built this way. If you want a future where computers serve humans and are ubiquitous, this path won't get you there.
So many engineers have no backbone - use your leverage. You are the one writing code, not some PM.
There are sane escape hatches today that will give your team productivity multipliers and allow you to blitz past these "resume driven development" companies.
Render.com, traditional server-side rendered frameworks, etc.
Advocate for yourself and your team. You will be surprised how much leverage and control you have.
That depends on the individual developer. For example, I'm working to clean up the mess that has become app dev w/ JavaScript (https://github.com/cheatcode/joystick), but I expect many will dismiss it short-term because it's not "what everybody else is doing" (despite being far simpler and clearer than the state-of-the-art).
And therein lies the problem: groupthink. There are very few people asking "how do we simplify this" or "how can we make this more clear" and a whole lot of people trying to impress each other with their galaxy brain knowledge of unnecessary tech.
The good news is that it's not a technical problem, but a cultural one. People are afraid to think independently and so they just go along with whatever the "best and brightest" say to do (which is usually an incentivized position due to existing relationships/opportunities).
I'm still doing my projects with LAMP technology, using my own framework: a 150-line kernel with filesystem-based routing, with maximum simplicity as principle number one.
Postmodern web development lost the Doherty threshold.
I measure my page load in tenths of a millisecond. Average page generated in 1-9 milliseconds, including the typical 2-6 simple local SQL queries.
Your complexity is my competitive advantage.
If you are in a position where you see piling up complexity does not bring in more satisfied users and more money that is a great time to set up a simpler competitor that will do things on the cheap in a less complex way.
HN's perspective on FE Development is that it should be a meager skill that can be on-boarded with ease, and there is frustration with frameworks (most notably, React), because what was once done with HTML/CSS and a sprinkle of JQuery has exploded into an actual field and specialization.
And, personally, I think it's warranted.
The explosion in what the front end demands simply tracks the specialization of the field.
15 years ago, we were just doing blog posts with form submissions; now we're trying to pave the way so that applications like Photoshop can be accessible on the web.
HN is getting old. There's no doubt about it.
And the young ones are starting to lap you guys.
I just hope I don't grow as bitter and hostile as some of y'all when my brain can no longer pick up new esoteric material with ease.
But most importantly: if it really is as you say, you can profit from it. Open a consultancy, and solve clients' problems without using these over-engineered solutions. If your competition truly wastes a lot of time, then you'll be able to solve the same problems faster and more effectively.
Software engineers are lazy, don't want responsibility, just want to have fun and be creative. That's not a recipe for good engineering. The industry will continue to chase its tail as long as we don't treat it like a real engineering discipline.
This is the definition of cargo cult, and companies in our industry have a higher than average tendency to behave this way.
Most technology, principles, methodologies, programming patterns, project management patterns, etc., are subjective, in that they work well for certain projects... not all. Even the massive overcomplexity we see, for example in containers, is sometimes worth it; it has its place. The issues come when people start behaving as you have found: copying what others are doing because they draw an oversimplistic connection between their chosen tools and their success as a business.
Either convince your peers, or even superiors that mimicry is a poor basis for technological choices (best argued by doing the analysis yourself and pointing out the real world applicability), or find a different company that understands this (they do exist).
My last employer did this: a facial recognition developer in the enterprise space; where every competitor in the industry has a server stack for their solution, we had a single integrated application replacing the entire competitor stack, and the entire thing can run on an Intel Compute Stick perfectly fine. The kicker is such a solution is exponentially less expensive to own and operate, and is exponentially less expensive to create because the types of people with these tight skills cannot find work, they are burned out game developers with extremely high optimization and complex simulation experience. They look at the complex world of web/mobile/modern development and simply want to cease writing code. I find them and we create enterprise killers.
1) some of these things (e.g. node, microservices) already peaked a few years back, being overapplied and now the pendulum is swinging the other way
2) others (e.g. Kubernetes, React, monorepo) were developed at large, profitable companies that others wish to emulate (or work at someday), so they find excuses to use them. This case takes longer to reach a point where things swing against it, because everyone wants to pretend their company is the size of FAANG or will be soon, but the same process of overapplication and backlash happens eventually
3) in the midst of all that noise, there are some new things which are in fact a good idea for most developers. I don't know Rust or Go, but perhaps they are examples of that.
The key for us as developers is, unless we wish to work at FAANG, try to spot (3) in the forest of (1) and (2), and don't let (justified) annoyance at (1) and (2) blind us to the fact that (3) is out there as well.
Need container orchestration? K8s is the best on the planet far and away
Need accelerated compute? Rust is a fantastic language that saves us from C++.
These tools are all fantastic and we should be very grateful we have them. If people are using them outside their use cases then that’s just bad engineering.
Phone calls are stupid complex nowadays compared to the old point-to-point wiring, but we can still very easily "pick up the phone and dial." It's an abstraction/mental model that's held since PBXs became automated.
When I studied machine learning 20 years ago, it was barely used, and everything was "from basics." The applied stuff was very simple, like an auto-encoder. Today, the way you think about, and teach, ML is not "a matrix here and a vector there," but in combinations of ANN layers.
There also seems to be increasing saturation in the development community: many people learn to leetcode before they learn the basic whys.
On one hand it makes work unnecessarily complicated and in some cases creates political problems, because the more complicated a solution is, the more governance it needs, and the more governance, the more politics. A lot of these solutions get put in place because the people promoting them are incentivized to be the one who found the "solution".
On the other hand it creates new opportunities for those who have the courage to not accept other people's assumptions about what "good" is and to find out for themselves and separate the wheat from the chaff. If you can adeptly use Occam's razor to decide for yourself what works and what doesn't you'll be ahead of the curve. Just keep calm and code on.
The only people who are screwed are the people who follow hype. The rest of us are just fine.
I try hard to walk my talk here but I catch myself doing it too. Simplicity is much harder than complexity. It requires more thought and deeper conceptual integration. Right now I am rethinking some older things and trying very hard not to second system effect it.
On top of this you have an industry pushing this stuff, and cloud vendors who love the added cost it brings from managed services and more overhead. Cloud makes money off complexity. It also makes things harder to move, which improves lock-in.
Lastly you have the fact that our industry is cash flush. There has been little need to trim the fat. Just raise more VC or add more billable SaaS or… well, crypto comes with its own casino revenue.
If you read up on the sociology of professional specialization you'll learn that most technical complexity in a field is there for competitive purposes. Jargon exists more to exclude and obscure than to facilitate.
So one predicts lower productivity as increased competition leads to the complexification of professions. This is all because higher education is broken. One of the functions of higher education, perhaps its most important function, is allocating human capital efficiently. It's fully derelict in this, preferring instead to sell credentials to labor that labor doesn't need, at the expense of the debt holders and students, to the delight of corporations. The result is zero productivity growth going back to the early '70s.
Building a website today requires learning a ton of different tools, languages and frameworks. All of them being moving targets, so by the time your website is done, some of its components are already deprecated.
And you could go a long way with a $5/month shared web hosting service. Now, "the cloud" is not only super expensive, but it's also very hard to even guesstimate how much the next bill is going to be.
Most people who would have made their own website now just turn to platforms such as Squarespace, or don't have a website at all and rely only on Instagram, Twitter and TikTok.
IMO, it's one of the reasons that Phoenix LiveView is so appealing for people because it removes so much complexity from building otherwise complex tooling.
I actually just had to come face to face with this because I've been developing a lesson plan to teach my son to program...and after looking at everything I settled on Linux command line + HTML/CSS + SQL. Then the decision came down to which language to teach and I narrowed the field to Ruby, PHP and Elixir.
Ended up settling on Elixir, simply because of the functional style and its total capabilities without having to introduce tons of additional technologies.
I don't know where you work, but the majority of enterprise jobs out there have always been focused on what's dominant in the industry. Today that is, as you say, Blockchain, NoSql, crypto, and micro-frontends, etc. While hearing trendy words like that make me want to hurl, that's how these run of the mill companies operate. They don't have time for an optimal bare bones approach, or building things from scratch. Again I could be completely wrong, maybe you work somewhere really cool that does more exciting work outside of business logic, react development, and dockers that dock other dockers into kloud goobernety docker sockets. But the point I'm trying to make is 90% of tech companies aren't very pretty, and as I'm sure you know that's part of why they pay so well and are stable.
Non-developers in tech, like recruiters, always seem to focus on new languages as if they're inherently better. And in some ways they're unwittingly right: there are fewer issues with backwards compatibility, at least, and thus more room for new features (which eventually become backwards-compatibility concerns of their own). And of course in some ways they're wrong: newer languages are less mature, and some argue that programming hasn't really changed since the '70s. This isn't very reassuring when the compelling features are GC, serialization, concurrency, and really dumb things like attractive syntax sugar that you care less about once you're in the woodwork anyway.
That Python to Rust talk does sound kind of stupid though. Almost like a higher up with enough power to make subordinates listen to them ramble about their favorite sports team programming language. I almost want to guess that whatever it was could've been done in C++ 20 years ago, but like I said, language wars are stupid and trivial. Interpreted languages are slower though, but that's pretty obvious.
I don't think we're screwing ourselves into something we're locked into, these are just dominant in enterprise roles.
The problem is:
a) Inexperienced developers who confuse jumping on hype with "modern" and sound engineering, especially when the project is not something to be deployed and forgotten but something that will need to be maintained for a decade or more (will your Kubernetes or blockchain still be around in 10+ years?).
b) Clueless managers that allow it to happen (or, worse, actively push it)
c) Spineless hucksters who would sell you the Moon as long as they get their commission.
None of this is the fault of the technology or of the engineers who created it.
Heck, I have recently witnessed a representative of a company manufacturing mining excavators (this type of equipment: https://daemar.com/wp-content/uploads/2018/12/dreamstime_m_8... - company is not Daemar, though) giving a breathless talk about how they "innovate in metaverse" by giving their customers the opportunity to buy NFTs of the pictures of their excavators. Seriously, not making that one up ...
That's just general lack of common sense, general lack of understanding of who your market is and what your customers are actually asking for (hint, NFT it probably isn't unless you are in the business of yet another crypto Ponzi scheme) combined with FOMO.
And the company management either gets it and tamps down on it, or the company will go out of business at some point.
This is not really about software - all of those things have their places and can have great benefits when used in the right way for the right purpose (not because it is trendy, modern or because the competition is doing it too) and by people who actually understand them (and the consequences of deploying them).
Google/Meta/Amazon are the last bastions of sanity. They use stable technology that works. And they keep taking more and more of the total software market as a result. The methods of avoiding this kind of buzzword-driven development are increasingly only extant within big tech. Other companies make do with perpetual-junior developers and people who can't hack it at big tech. These companies will never develop an engineering culture, and they'll never break away from the "CTO heard about web3, and now we all must use web3, somehow" dynamic.
If these things aren't at 100% you're just adding to problems with more things, not solving them.
Perhaps after a downturn, things will revert to the mean.
For example, we have deliberately stuck with a single Node.js/Next app in a single repo using Postgres running on Heroku. We are 5 engineers now and plan to keep it that way for the foreseeable future, even as the team grows.
There is some complexity we probably don't need – the JavaScript ecosystem is notorious for this – but what we use is all reasonably boring tech at this point, and it allows us to stay productive as a team, focusing on delivering value instead of just maintaining things or chasing trends.
I think some of the complexity stems from trying to make things digital that simply aren't or shouldn't be.
The industry seems to be constantly spinning its tires, putting a lot of effort into rediscovering mostly the same things every decade, while really hard problems remain unaddressed. That's clear when you see that most of the important algorithms were published before the '80s.
Don't managers understand that development is a constrained resource? They have to choose which projects move forward, where people are assigned, and increasingly, which outsourced service to use because they don't have enough in-house resources to turn to.
My cynical view of the move to complexity is management (or their C-level superiors) are often sold on new platforms or "standards" that require it.
This is at a startup that doesn't even have 100 concurrent users, and their data and queries are nothing special.
Why the heck can't we trigger an email from our internals? Oh, we don't even host our own email... because we're using a different company to host ALL our emails, documents, filestorage, etc...
i'm_in_danger.gif
There are so many different ways to build web services and the hardware (CPU/GPU/RAM/network bandwidth) and the software (OS/Nginx/Python/PHP etc.) have become so good that at the end of the day, they all work, more or less, which means that such complexity can always be justified.
I feel like software written for embedded systems that work with the physical world suffers less from these issues, because the environment is just less forgiving.
Just focus on making something great, and don't get too caught up in all the fashion. Software lasts way longer than people think. No one cares what brand and type of hammer a builder uses to make an amazing atrium. Likewise, no user ever thought, "this video editor would be better if it was written in Rust and ran in Kubernetes."
But why are s/w developers worried? As long as tons of advertising money find their way into glorified blogs, they will get paid, no matter how much complexity they invent to justify their workload.
Kubernetes is complex, and FTPing some .rb files would be simpler: until one of about 145 different situations arises that Kubernetes forced you to account for ahead of time.
Whenever you find yourself complaining about the complexity of a tool: ask yourself “am I smarter than everyone in my industry, or do I possibly not understand the problem entirely?”
https://deft.com/blog/cloud-repatriation-isnt-a-retreat-but-...
Software is malleable, people are generally smart. It may take longer than you hope it does but things will shake out just fine as teams/companies are forced to look critically at their infra spend vs utilization and adjust accordingly.
Yeah there is a lot of overbuilding and BS in our industry, but I don't think we're unique in that regard. It is safe to block out the noise and focus on what excites you.
https://pingineering.tumblr.com/post/116038532184/learn-to-s...
(let me know if someone has a better link.)
So to answer your question - most of the industry will not move to simpler solutions. It goes without saying that a small fraction of the industry does require those complex systems, but they are relatively rare.
We probably are doomed if there is no push back and debate with the vocal minority. Silence is often mistaken for complicity.
What you say about needless complexity is a very valid point, but it's just growing pains imo.
Industry is infested with people who hate programming but love status.
Whoever develops the “Visual Cloud” IDE for writing scalable web apps where everything “just works” will be about $10B richer…
I don't think we're screwed.
I agree that at times complex solutions are prioritized for the wrong reasons, e.g. to create more work, generate buzzwords, or look nice for hiring and investors. But ultimately these are tools with tradeoffs.
I happen to like K8s, monorepo, and Go because they solve problems that I have personally run into. I think crypto goes too far and doesn't really solve anything.
In terms of complexity, I don't see these tools as going from algebra to calculus, but more like re-learning variations of algebra over and over. Sure it's tedious, but it's not rocket science.
However if you don't like dumb industry trends that don't create business value you can always go work for a series A startup. They DGAF about the frills or buzzwords, they just want fast results.
We should be less original and try to copy mathematics - their theorems are valid for thousands of years. Our codebases last maybe a decade.
The developers don't really lose in this situation unless they own the business.
I'm not saying this is a good thing. Just assessing the reality.
Engineers love to overengineer, because they can. And because it’s a lot of fun.
And then they end up shooting themselves in the foot with unjustified complexity.
I'd suggest go work somewhere else.
So I wouldn't say we've screwed ourselves.
Would we be better off if we took a different path? No one can or will ever know.
Recently, one of Alan Kay's talking points has been that "software engineering is an oxymoron", and I couldn't agree more. What he means by this is that, instead of the principled approach to design and development characteristic of other engineering disciplines, software people do little more than what amounts to tinkering. Partly the blame lies in the shift to agile methodologies, adopted wholeheartedly with little understanding of what the old-style process was doing. Projects, moving incrementally, are stuck in local maxima in the name of "product-market fit".
That's the demand side of things; you've described the supply side pretty well. Developers like dealing with problems, so they naturally and unconsciously seek out more complexity. If you look at how even mediocre developers can make >200K easily now, it's not hard to see how that's a massive problem for everyone. All this complexity, especially from getting the various separately developed components to work together, gatekeeps the profession and business of making software. I'm at one of the companies that doesn't spend the most to hire, or have the shiniest perks, and let me tell you, we're desperate to get anyone we can get. This is unsustainable, and I worry we need to solve it before AI takes the means of programming out of our hands.
So, what is to be done? There are plenty of examples of software that gave the non-programming masses a means to build. Spreadsheets like Excel are by far the most popular, and have driven corporate computer adoption since VisiCalc came out in 1979. When they were simple, scripting languages like PHP and Perl could be handled by a non-engineer, as long as the admin side was handled. But I think the most interesting cases are those of full, contained programming and authoring environments, like Smalltalk and HyperCard. By being the entire system, they could cut out all the accidental complexity brought on by these interfacing components, and instead let users focus on building software. Importantly, they don't deal with the concept of files for code; instead it lives alongside whatever object it's relevant to. For better or for worse, object-oriented code is easier to reason about and empathize with. The more imperative code gets, the more the programmer is forced to play computer, which I think is the determining factor in gatekeeping programming today. The way forward is having the computer explain itself, be visible, and unsurprising, which modern stacks seem to be moving away from.
Furthermore, it's becoming more and more hype/marketing driven.
Solutions are adopted because they are popular or "cool". CV-driven development is becoming the norm.
Web tooling is better than ever. I can very quickly spin up a full-fledged production grade app with very little investment. I don't worry about blockchain or NoSQL or any of that. I just use tools that make me a productive engineer and that's ultimately what companies are interested in. If you're worried about recruiters asking you if you've looked at modern languages, then you've got some bigger fish to fry. If you don't know the language, the answer you should feel like giving is "I can learn anything, and I'd be happy to prep for the job."
I'm currently working on a statistics website for a game named Smite. The ingestion engine is powered by Go/Redis/PSQL/Docker, and the frontend is Next.js deployed on Render.
This is hardly complex. The Go binary reaches out to the Hirez API service, requests some data, caches it on Redis (in case we need to run the ingestion multiple times during development and to avoid service quotas), and then stores the data in a normalized data structure in Postgres. With Postgres I can now run SQL queries on top to gather stats about the playerbase and the games. All of this is done on my local machine. My MacBook has about 1 TB of hard disk space, which used to be unheard of a couple of years ago, so I have no worries about my database growing to a size I can't manage (old matches are also pruned and removed).
The next part is the frontend part, which is what I'm working on now. But this is also super simple. I'm using Next.js to statically render a website using SSG. I basically reach out to the Postgres database locally, grab the data points I need, render the UI into static HTML files, and then I just take that build, push it to Git and it triggers a job to deploy it on Render. All of this tooling is ridiculously and refreshingly simple.
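The cache-then-store step described above is the classic cache-aside pattern. Here's a hedged Python sketch of that shape, with a plain dict standing in for Redis and a list of tuples standing in for the Postgres insert (the match/player fields are invented for illustration, not the actual Hirez API schema):

```python
import json

def ingest_match(match_id, api_fetch, cache, rows):
    """Fetch a match through a cache so repeated dev runs don't
    burn API quota, then flatten it into normalized rows."""
    key = f"match:{match_id}"
    raw = cache.get(key)
    if raw is None:
        raw = json.dumps(api_fetch(match_id))  # one upstream API call
        cache[key] = raw                       # SET in the real Redis
    match = json.loads(raw)
    # normalize: one row per player instead of one nested JSON blob
    for p in match["players"]:
        rows.append((match_id, p["name"], p["kills"]))
```

Running ingestion twice for the same match should hit the upstream API only once, which is the whole point of the Redis layer in a setup like this.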
I think you're really overthinking it.
Newer programming languages (the ones from 10 years ago like Go and Rust) are much better than the ones from 30 years ago like Java and Ruby. This doesn’t mean that they should be used for everything but especially the simplicity of Go is always putting a smile on my face whenever I can use it. Compare that to Gradle Maven Spring Boot whatever Java stack - there goes your unnecessary complexity.
What you also have to understand is that many of the things you complain about are solving non-technical problems. Monorepos are great at breaking up silos between teams and enabling vertical development of features across the stack in an organization. They come with added complexity in terms of the tooling and automation needed. It's a trade-off that might look bad if you only take the tech aspects into account.
Kubernetes in the cloud and its sister systems may look more complicated to you as a developer, but if you compare it to managing a physical data center, including all the staff needed to operate and maintain it, it's really much simpler, especially when dealing with hardware failures, dynamic scaling, etc.
I'm still suspicious of Guava, let alone Rust.
But whether or not they're overly complicated, I think the reason why these things are grating is because they're less fun than coding up solutions to problems 15-20 years ago. Configuring containers is a pure exercise in versioning hell, and with the emergence of devops, it's impossible for developers to avoid.
Nobody is focusing on the key issues of our industry: what is important and what needs to be changed to have healthy, productive, respected and comfortable professional careers. To make this happen, a shift in focus needs to come about.
This is an important writing on the topic
- Trend-following (Mgmt. FOMO) is real
- Resume-driven development is real
- Sometimes the 'new' stuff is better
IIRC I found this site because of this essay by Paul Graham:
http://www.paulgraham.com/icad.html
TL;DR: don't worry about industry, what's the most efficient way of doing things? Do that.
(Only half joking.)
It was never our industry. There was a brief window ~2008-2010 where software engineers had a lot of power within their orgs, but at the end of the day we were always the laborers, never the owners of this industry.
Capitalists loathe a monopoly on skill; it gives labor a dangerous amount of leverage.
The people who own our industry, mostly venture capitalists and other investors, are interested in capturing value at all costs, and in limiting the power that engineers had.
This was the drive behind the "MVP" and "ship it!" cultures that are partially responsible for this mess. But complexity is also valued by management because it reduces the ability of an individual engineer to have an impact, thereby reducing their monopoly on skill. In addition, we've seen an industry pop up which focuses exclusively on rushing in new, minimally skilled devs looking to make a quick buck. These are people who only know how to fiddle knobs in a big complex machine.
This is also why the hiring process is even more awful today than it was a decade ago. A decade ago, anyone who was passionate about programming and had a GitHub repo filled with cool projects could get hired. This has been transformed into a machine that seeks to make sure every engineer is the same, trained only to pass a series of algorithmic puzzles from LeetCode and HackerRank. These are even different from what they emulate: the old Google challenges were hard, but given by devs who knew what they were doing. Half of the algorithm puzzles I've been given in recent years are clearly written by devs who only know what the answer is, but don't have any deeper insight into the problem.
> are we doomed forever?
Only until this latest wave of tech (it's not really a bubble) crashes. Once demand for software skill plummets, it will likely be like the dot-com bust valley of 2004-2010. The only people doing software then were people who cared about it, and because salaries crashed, many good engineers found other niches where they could apply their skills. That's when you saw some really interesting problem solving going on in the field.
I'm releasing a (web) development platform for it this month. Just getting the code ready.
We as a profession spend a lot of time _solving the same problems_. This isn't necessarily a bad thing, different implementations allow for specialization in unique but important ways. Where I think we've gone wrong is that we can no longer generically re-use a lot of code between code bases because a lot of those libraries are written in dead-end languages.
What I'm referring to as dead-end languages are any programming language where you can't use library code independently outside of its ecosystem. Golang, Erlang, Javascript, Python, Ruby, the entire Java land of languages are all one big ball of intertwined dead end ecosystems, even Rust to a lesser extent. Any library written in one of those languages is locked in to that ecosystem and will never have a chance at becoming a generic foundational building block for systems outside their ecosystem.
One of the reasons we're even able to rapidly build so many complex systems is the foundational libraries like libcurl that have "solved" a problem well enough and is reliable enough that it is effectively an easy default decision to use them. These are libraries that have more or less solved some hard problems sufficiently that other engineers can mental model them away without knowing the implementation or protocol details.
I've seen others compare these modern methods and tools to old-school in-house one-off development and how difficult that made things. This is the same effect but rather than lock in at a company level, its lock in at a language or library level (don't get me wrong this is generally better than random in house one-offs). If you're familiar with the Golang net/http package that mental model can't be transferred to another language and there is no way to expose that functionality to a language other than Golang due to how the language itself is designed.
As frustrating, old, decrepit, and unsuitable for a lot of things as the C ABI is, any language that can produce a library exposing its functionality through the C ABI is table stakes right now for avoiding the sprawling landscape of language lock-in. Even in languages that support exporting libraries via the C ABI, there is always that concept of "other" that seems so problematic to me. It's not _bad_ writing a library in Rust, but the boundary between Rust-land and the other is uniquely un-interoperable, or overly repetitive in its own ways, requiring layers of abstraction and special behavior to work. For example, if you have two separate system libraries written in Rust that do all their work behind the scenes using tokio, are they actually going to share that runtime? No. There is no common libtokio.so file on the system, no cooperation or resource management between the two, and no common library to update if a security vulnerability gets detected (for the pedantic: I'm referring to pre-compiled, distributed libraries as a system building block, not the common source you can compile on your own). Bundling specific versions into the compiled artifacts makes inconsistencies between the systems running the code matter less, but you end up with log4j-like situations where you're entirely dependent on the packager, maintainer, or vendor to handle your security updates, and you have to trust that they got it right.
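The "C ABI as table stakes" point is easy to demonstrate from the consuming side: nearly every ecosystem can call into a C-ABI shared library. A small Python sketch using ctypes against the system math library (the fallback name `libm.so.6` is a glibc assumption; `find_library` handles most other platforms):

```python
import ctypes
import ctypes.util

# Locate and load the C math library through the platform loader.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# The C ABI carries no type information, unlike a language-native
# import, so we declare the function's signature ourselves.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

This is exactly the kind of cross-ecosystem reuse that an all-tokio Rust crate or a Go net/http-style package cannot offer without an explicit C-ABI export layer.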
I think one of the big reasons we're experiencing this spiral of complexification is that we're not generating those foundational building blocks any more. There is no refinement tuning out the complexity of the system and distilling best practices into library defaults. There are no common underpinnings being generated that can be maintained, understood, and diagnosed system-wide. We can't reason about this utterly shattered set of walled ecosystems.
I think there's a lack of theory for software complexity. Complexity is a loaded word with several definitions, so when I use it here I mean it in the sense that we don't have a theory to explain how things should be modularized and grouped. How you group and modularize protects your code from future technical debt, but each modularization may also come with an associated performance cost. There just isn't a theory that unifies all of these things. There isn't even a theory that explains just the modularization part without the performance cost.
When we don't have a theory for something, we have a word for how we operate in that realm: "design." This sort of stuff still lives in the realm of design, and anything that lives there is very hard to fully optimize. Industries built around the word "design" tend to move in trends, which can ultimately be repeating circles, because each change or paradigm shift is a huge unknown. Was this "design" more optimal than the last "design"? Is modern art better than classic art? Who knows? In fact the needle can often move backwards. The actual answer may be "yes, the current design is worse than the last design," but without a quantitative theory giving us a definitive answer we don't fully know, and people argue about it and disagree all the time. There are art critics, but there are no math critics.
Take, for example, the shortest distance between two points. This is a well-defined problem, and mathematically it's just a line. You don't "design" the shortest distance between two points. You calculate it. This is what's missing from software. Architecture and program organization need to be calculated, not designed. Once we achieve that, the terms "over-engineering" and "design" will no longer be part of the field.
If you squint, you can sort of see a theory behind software architecture in functional programming. It's sort of there, but FP doesn't incorporate performance costs into the equation. Even setting the performance metric aside, it's very incomplete: there is still no way to say one software architecture is definitively better than another. There may never be a way. Software may be doomed to a sort of genetic drift, where it just constantly changes with no point.
The complexity of software will, however, always be bounded by natural selection. If a technology becomes so complex that it's unmaintainable, people will abandon it and it will be culled from the herd of other technologies. So in terms of being "screwed," I think we're fine. But within the bounds of natural selection, there will always be genetic drift, where we endlessly cycle between technologies and popular design paradigms.
You are awesome and cool and raking in the kudos and bucks if you are piling yet more stuff (especially big, complex, and unstable stuff) on top.
You are a stupid nobody loser if you are the dutiful maintainer in Nebraska.
> Any headline that ends in a question mark can be answered by the word no
https://en.wikipedia.org/wiki/Betteridge%27s_law_of_headline...
Are all CRUD APIs insensitive to performance? Correctness?
Right now the assumptions are heavily entangled with the platforms: we know that our audiences are on "X," therefore "X" becomes part of our strategy, and our end goal (from an evolutionary-success standpoint) is to make them standardize on our "Y." This happens from all parties: devs, consumers, firms, governments.
Thus when I boot up Windows I'm confronted with a cacophony of different updaters, notifications, etc., all of them trying to exploit the platform I'm currently using to pull me deeper into some other platform.
If we fast forward this process, we can see that the nature of computing ecosystems is to be a jungle that exceeds understanding. And in jungles, there are a multitude of niches. Compatibility is situational. Although one could point to apex predators, they don't exactly "rule" the jungle.
Which means that the appropriate goal to achieve in a software's lifecycle is most likely a sustainable niche that only needs to know about a few things. But our industry is not doing this yet. Why? Because software has not eaten the software industry yet.
That is the culmination of all these decades of churning on code: eventually we end up with software that is better at coordinating information and activities for society than any human-mediated organization could be. And you look at the technologies we have, and assume there's a logistical function to them, and it's like: OK, maybe AI can do that. Maybe blockchain can do that. Maybe cloud and no-code frameworks can do that. Maybe if you bodge those things together, you end up in a place where the professional developer isn't dealing with as many details, like photography vs painting. And if that's really the case then you don't have to write nearly as much of an app: it will start hooking into the ecosystem readily, instantly presenting the views on information that you need and filtering noise for you.
We haven't had a really fundamental realignment of the economy since the end of World War II. And if you look at movies from then, the economy that emerges is sensible within its concepts of how economies should move forward: information was still expensive, and while many novel things could be mass-produced, you needed firm structures to coordinate them (newspapers multiple times a day! Icebox and milk deliveries! Mail-order houses!), and you needed a new set of infrastructure to animate this action. Highways, supermarkets, shipping containers, and TV were all representative of where the world was going: an ecosystem of "products and services." And you could learn what products and services a city had by walking through the phone book and making calls.
But over the past few decades, it's saturated into an "attention economy". There are so many goods available that you'll never know about all of them, so the information systems have to take up the task of digesting it and leading us towards our best lives. So the task of making software simpler is also a task of making economic coordination simpler. And we are still going to be using the products and services framing for some time, but it's likely to get weird.
1. Legacy codebases accumulate not only a ton of "normal" tech debt; larger codebases within larger orgs also carry the battle scars of a sort of forced evolution. Bandaids on bandaids, pulling in new frameworks and patterns, all requiring often-heavy transplants. Like a future archaeologist finding artifacts of the steam engine, combustion engine, nuclear reactors, and lithium batteries, you can infer the good intentions, but unlike the original implementors you can clearly see, with hindsight's 20/20, the "unforeseen" side effects they couldn't (or were just incentivized not to). Unlike those societal-scale energy innovations, the microcosm of human engineering optimism and naïveté that is your organization's legacy codebase went through a more rushed, "artificial" evolution, likely a casualty of the kind of short-sighted product-roadmap dynamics we're all too familiar with. Less a million-year-evolved shark and more a "cute" purebred pug. Can we go right to the shark? Probably not. We'll make hundreds of thousands of pugs before we ever get to the shark. In the meantime, the optimist will see these pugs as forcing functions that help evolve the encompassing ecosystem at large to be more conducive to the arrival of a "shark."
2. You ask "have we screwed ourselves?" and I think this implies too much confidence in how "separate" we are from these iterative codebases. To go back to the evolution metaphors, we're just introducing mutations under real-world conditions. Each product of that evolution is a reflection of those conditions more than of the potentially great ideals any single participant in the petri dish held at the time they took part in the engineering. By the very nature of large-scale, cooperative engineering, we bake in our collective foibles, and the engineering disciplines we hold at any given time are just one, usually smaller, dynamic at play.
3. I may sound pessimistic or over-deterministic, or as if I think our efforts are futile in light of larger dynamics, but I'm not; I'm sipping coffee right now, I'm good. Acknowledging that we have an outsized perception of our effect in these situations may help release you from the oftentimes infuriating pain that accompanies the mundane banalities of daily software engineering. Champion the better ideas, sure, but maybe with a good-humored flexibility that comes from knowing today's great ideas are just memes bootstrapping the evolution of tomorrow's much greater ideas, and in this chaotic soup there's some beauty.
Trust me, this is not the Zen outlook I'm able to stay within at all times (or even most of the time) but I want to try to and also remind myself why I thought all this software stuff was cool in the first place. I'll end with a quote I'm reminded of.
“For the simplicity on this side of complexity, I wouldn't give you a fig. But for the simplicity on the other side of complexity, for that I would give you anything I have.” ― Oliver Wendell Holmes