HACKER Q&A
📣 thoughtFrame

Anyone else feel weird about the distributed/cloud-first trend?


Even though I have some years of programming experience and CS classes behind me, I feel like I never properly learned to understand the machine. I know that CPUs nowadays are insanely fast, and I know that I/O is usually the bottleneck, but going distributed-first just makes things worse by making everything harder to reason about (now we need DNS, service discovery, nodes that orchestrate other nodes, etc.). And since every request goes through the network, it feels like we're increasing the probability of failure even though we no longer have a single point of failure; to solve that, a lot of redundancy (and therefore complexity) gets added on top.
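To make the "more hops, more failures" intuition concrete, here's a back-of-the-envelope sketch. The per-hop failure rate (0.1%) and the hop count are made-up illustrative numbers, and it assumes hops fail independently, which real systems often don't:

```python
# If a request must traverse several independent components (DNS lookup,
# service discovery, orchestrator, target node, ...), the chance that
# *something* on the path fails grows with each hop.

def p_any_failure(per_hop_failure: float, hops: int) -> float:
    """Probability that at least one of `hops` independent hops fails."""
    return 1 - (1 - per_hop_failure) ** hops

# A single machine: effectively one "hop".
print(f"1 hop:  {p_any_failure(0.001, 1):.4%}")   # 0.1000%

# A distributed request path with 5 hops at the same per-hop reliability.
print(f"5 hops: {p_any_failure(0.001, 5):.4%}")   # 0.4990%
```

Five hops at 99.9% each gets you roughly 5x the failure probability of one hop, which is exactly the gap that redundancy (and its complexity) is brought in to close.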

So, seen from this light, isn't it weird that it has become such a popular approach?

And where do programmers (I'm thinking of low-level game developers, graphics programmers, and high-frequency trading engineers) learn to write deliberately performant software on a single machine before scaling horizontally?


  👤 akagusu Accepted Answer ✓
It's not weird. It's how tech vendors make money, and they've spent lots of money convincing us that this is the way to go.

Tech vendors don't make money from simplicity but from complexity.