HACKER Q&A
📣 caprock

How to balance feature delivery with internal quality?


Something that's been on my mind a lot lately is the continuum between:

a - the velocity of delivering a product and features

and

b - ensuring internal code quality, performance, and the right architecture

My personal bias leans heavily toward prioritizing the delivery of minimally complete features. I've noticed, though, that many software engineers focus more on code idioms, general quality, and mitigating future risks.

Is this just one of those never-ending dynamics? Or do most HN folks lean one direction? Or does the answer change based on the life stage of a company?


  👤 PaulHoule Accepted Answer ✓
I think the #1 cause of both slow development and quality problems is a mismatch between the mental models an application is based on and the problem domain and needs of its users.

If the mental models are correct, it is possible to add new features and fix bugs.

If the mental models are incorrect you are at best going to be pushing a bubble around under a rug and everything is going to be expensive and broken at the same time.

Frequently organizations talk as if they were making a tradeoff between speed and correctness, but where that kind of talk prevails you usually find that development is slow and everything is broken anyway. That is, talk of a "tradeoff" obscures the fact that there is one right thing to do: bring the mental model in line with the domain, then bring the application in line with the mental model.

For instance, splitting functionality into small features that can be deployed in small increments is not "low quality"; in fact it can be conducive to very high quality, because an incrementally developed system feeds experience back quickly, so the system ends up with all the right attributes.

Where people get into trouble is with data structures as opposed to algorithms.

I went through a phase where I fixed maybe 15 seriously broken applications in six months, and most of the time the hardest problem was something seriously wrong with the database structure that caused errors and made it difficult or impossible to add features. Overhauling an app like that was probably 80-90% migrating the database to a proper structure; the rest was almost trivial once the right data structures were in place.
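To make that concrete, here is a minimal sketch of the kind of structural fix involved, using Python's built-in sqlite3 and an in-memory database. The schema and names (`orders_old`, `customers`, etc.) are entirely hypothetical, not from any of the apps described above; the point is only how a denormalized table gets migrated to a proper structure.

```python
import sqlite3

# Hypothetical "before" schema: customer data repeated on every order row --
# the kind of structural flaw that breeds errors and blocks new features.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders_old (
        id INTEGER PRIMARY KEY,
        customer_name TEXT,    -- duplicated on every row
        customer_email TEXT,   -- duplicated, and free to drift out of sync
        item TEXT,
        total REAL
    );
    INSERT INTO orders_old VALUES
        (1, 'Ada', 'ada@example.com', 'widget', 9.99),
        (2, 'Ada', 'ada@example.org', 'gadget', 19.99),
        (3, 'Bob', 'bob@example.com', 'widget', 9.99);
""")

# Migrate to a normalized structure: one row per customer, orders reference
# customers by id. Conflicting duplicates (Ada's two emails) get resolved
# during the migration instead of lurking in the data.
conn.executescript("""
    CREATE TABLE customers (
        id INTEGER PRIMARY KEY,
        name TEXT UNIQUE,
        email TEXT
    );
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id),
        item TEXT,
        total REAL
    );
    INSERT INTO customers (name, email)
        SELECT customer_name, MIN(customer_email)
        FROM orders_old GROUP BY customer_name;
    INSERT INTO orders (id, customer_id, item, total)
        SELECT o.id, c.id, o.item, o.total
        FROM orders_old o JOIN customers c ON c.name = o.customer_name;
""")

rows = conn.execute(
    "SELECT c.name, COUNT(*) FROM orders o "
    "JOIN customers c ON o.customer_id = c.id "
    "GROUP BY c.name ORDER BY c.name"
).fetchall()
print(rows)  # order counts preserved, one customer row each
```

Once the data lives in the right shape, features like "update a customer's email" become one-line changes instead of a hunt through every duplicated row.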

If you are thinking ahead you should be able to deliver functionality in nice small increments and never find out that "oh crap, my data structures are totally wrong and I need to rewrite this."

Facebook, Adobe, Microsoft, Google, Epic, and Cerner are monopolies that don't face competition: they can hire expensive developers, use them poorly, and subject users to the resulting boondoggles. Other software companies, not so much.


👤 matt_s
> never ending dynamics

Mostly. I think it depends on the life stage of the product/app. The interesting thing about this dynamic is that if you prioritize code quality, having the "right" architecture, etc. without building the features that capture and keep paying customers, it's not likely the app/product will be around long.

Make sure the app is designed to be agile enough to adapt, but don't solve problems you don't have.


👤 stevenalowe
Speed requires quality; there is no trade-off.