HACKER Q&A
📣 tacostakohashi

Working with folks who can't see the forest for the trees


A place I worked at once with a large/complex/legacy/profitable C/C++ codebase (as an example) had an interesting dynamic:

Some app owners, who would be like "omg our software is segfaulting, it didn't segfault before, we need to track down the exact last thing that changed that made it segfault, fix that 'root cause', and leave every other dumb thing we are doing exactly the same as it was before in case we break anything, because we should only fix the exact thing that caused the most recent segfaulting incident."

Some other developers, who would be like, ok... "why don't you start by fixing all of your compiler warnings?", and then your app will probably gradually start working better, because the exact thing that caused it to segfault was probably one of the hundreds of compiler warnings / obvious problems with the code, and it's not really worth obsessing over which exact dumb thing is the problem this time; maybe just stop doing dumb things in general, and then there will never be a problem caused by doing something dumb?
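(To make that concrete, here is a hypothetical illustration, not taken from the codebase above: the kind of code that compiles with just a warning under -Wall, and is also a latent segfault waiting for the right conditions.)

    #include <stdio.h>

    /* GCC/Clang warn here, roughly: "function returns address of
       local variable" -- buf dies when greeting() returns.        */
    static const char *greeting(void) {
        char buf[32] = "hello";
        return buf;                 /* dangling pointer */
    }

    int main(void) {
        /* Undefined behaviour: may print garbage today and
           segfault next week under different load / layout.       */
        printf("%s\n", greeting());
        return 0;
    }

(Something like gcc -Wall -Wextra flags this immediately; that's the "hundreds of compiler warnings" point above.)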

In reality, there is some merit to both positions... but I have seen lots of extremists, especially in the first camp, who essentially refuse to improve anything unless it has been proven to have caused an incident, and never proactively, almost as a matter of principle. Like, if you suggest to them that you see something sub-optimal (here is the PR/patch with test coverage, etc., let's just improve it and take it out of the equation for future issues), they'd be like no, we must focus only on our most recent incident, and unless you can prove that the general improvement would have prevented a previous incident, it has no merit.

Much of it comes from a simplistic / naive / reductive understanding of a "root cause". For any complex system it's more like the swiss cheese model, with the holes lining up: if any one of several things (user traffic, other host load, race conditions, etc., etc.) hadn't been true, it wouldn't have happened, so it's not really useful to insist on identifying and fixing a single "root" cause to the exclusion of all the other contributors.

Anyway... I am wondering:

* Is there an accepted metaphor / term for describing this tension? I guess it's basically a special kind of "can't see the forest for the trees" / "stuck in the weeds", anything closer?

* As someone pretty much in the second camp... any tips for working constructively with people in the first camp?


  👤 precompute Accepted Answer ✓
Some people just don't think doing a good job the first time around is a priority or even a requirement. You can often find that this attitude permeates everything they do and colors their life. There are only three things that can be done:

- Make them see the truth: this would mean you'd have to be ready to take some blame for their mistakes. Wouldn't recommend this if you aren't the manager.

- Get close with the manager / BECOME THE MANAGER so you can manage these people out

- Start looking for another job

Edit: These people seem to lack Associative Horizon and don't have the tinkerer / hacker spirit.


👤 brudgers
large/complex/legacy/profitable

Profitable is the forest.

The rest, just trees.

Good luck.


👤 austin-cheney
No. Well… maybe fear of simplicity.

That’s something I use personally to rationalize why people prefer super complex, error-prone approaches over approaches that are more durable. The reason almost universally comes down to familiarity rather than measurement, propped up by a long list of selection biases.

The best way out of this, which always works for me, is to go lower: one or more steps lower in the technology stack than what you are currently using. That could mean eliminating a dependency, using a more primitive language, or merely rolling back a shallow layer of abstraction.

The reason this works is that forcing a more primitive approach requires giving up some convenience. This isn’t a technology problem. It’s a people (discipline) problem, so you have to impose constraints that demand more disciplined participation. Yes, this will piss people off, because it’s a people problem, but it will solve the problem and result in a superior product.


👤 yuppie_scum
I think you are talking about technical debt. Read the classics: The Phoenix Project, the DevOps Handbook, and the Google SRE Book for plenty of discussion. Another great book on legacy software is Kill It with Fire: https://www.amazon.com/Kill-Fire-Manage-Computer-Systems/dp/...

👤 mejutoco
How is blame attributed in that environment? While it is difficult to know from the outside, the first camp might be a reaction to a place where people are blamed for every mistake; it seems like a defensive position taken to avoid blame. Is it a place with very low tolerance for risk?

👤 revskill
Sure, it's like finding the roots of a quadratic equation: instead of deriving the actual formula, you build a neural network to approximate them (first camp).

It's really "hard" to find the root cause. Stakeholders want an approximate solution first.


👤 kidgorgeous
Find a new job