This must lead to a lot of problems which I'll only face after ramping up, but even onboarding is painful if you don't have e.g. clear automated testing set up.
I've been in the industry for about 10 years and testing has always been an important part of the job. I guess I saw it as a given; to me part of the job _is_ writing tests. I assumed this was fairly obvious in 2021.
How should I approach this with the team and my manager? Keep in mind I'm new and an IC, not very senior. Others on the team might already have struck this chord in the past as well. Or perhaps I should drink the Kool-Aid and shut up.
The problem with software development techniques and tools like automated testing and style guides is that while they look like good ideas, and work at some places, there's not a lot of empirical evidence to support claims that automated tests will improve developer productivity or code quality.
Programmers usually perceive any new code base as crap in desperate need of new ideas and tools. It takes a while to get into a code base and really understand it, and by that time some things that seemed terrible or confusing at first will make sense.
The change defect rate (what percentage of your commits exist to fix a problem introduced in an earlier commit) is very easy to sample manually, and gives you an idea of the scale of the problem.
Then you can go to the team and say, "Look, 40 % of the things we do are fixing up problems we created in the first place. With better processes in place, we could free up a lot of bandwidth for things that matter."
Or maybe you find out that you have a change defect rate of only 12 %, and the things you think are important, for some reason, don't matter in this team.
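A crude way to automate that sampling, assuming fix commits can be spotted by keywords in their messages (a rough heuristic; the sample messages below are made up, and manually classifying a random subset of commits is more accurate):

```python
# Crude sketch: estimate the change defect rate from commit messages.
import re

FIX_PATTERN = re.compile(r"\b(fix|revert|hotfix|regression)\b", re.IGNORECASE)

def change_defect_rate(commit_messages):
    """Fraction of commits that look like they repair an earlier commit."""
    if not commit_messages:
        return 0.0
    fixes = sum(1 for msg in commit_messages if FIX_PATTERN.search(msg))
    return fixes / len(commit_messages)

# Hypothetical sample of recent commit subjects:
sample = [
    "Add export button to report page",
    "Fix NPE introduced in export button change",
    "Revert flaky cache warming",
    "Update onboarding docs",
]
print(f"{change_defect_rate(sample):.0%}")  # → 50%
```

Feed it `git log --format=%s` output and you have a number to bring to the meeting, though keyword matching will both miss fixes and flag false positives.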
----
Measurements that complement the change defect rate for a fuller picture, according to Forsgren et al.:
- Deployment frequency (literally, what's the average time between deployments to production)
- Time to fix (when a problem is first reported, how long does it take on average until the working fix is deployed to production?)
- Lead time (once a developer considers themselves "done" with the code, how long passes until that code is deployed to production?)
The time to fix and change defect rate measure quality, so these are the ones you probably would focus on. The other two measure speed.
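All of these reduce to averaging time spans between pairs of events. A minimal sketch (the event log here is hypothetical; real data would come from your ticket tracker and deploy history):

```python
# Sketch: average the gap between (start, end) event pairs, which works
# for lead time, time to fix, and time between deployments alike.
from datetime import datetime, timedelta

def average_delta(pairs):
    """Mean time between (start, end) timestamp pairs."""
    deltas = [end - start for start, end in pairs]
    return sum(deltas, timedelta()) / len(deltas)

# (developer marks code "done", code reaches production) -- made-up data:
lead_times = [
    (datetime(2021, 3, 1, 10), datetime(2021, 3, 3, 16)),  # 54 hours
    (datetime(2021, 3, 2, 9),  datetime(2021, 3, 2, 17)),  # 8 hours
]
print(average_delta(lead_times))  # → 1 day, 7:00:00
```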
If they're suffering the dire consequences that you predict, the mistake that you've made is joining the team. Newcomers can't fix systemic problems unless the organization actually wants said problems fixed AND said newcomers are tasked with fixing said problems. Even then, odds are against success.
If they are shipping reasonable product, then they're not suffering said dire consequences. (You did check whether they ship before you joined, right?)
In that case, you're fairly ignorant about some key aspects of the company and how it works.
Such ignorance is a bad basis for suggesting change, no matter how good said change might be.
To elaborate: formal style guides prevent simple things like formatting code in a way that is easier to read in some cases ("why is this newline here? Why did you column-align this block of expressions?"), and more insidiously force you to write worse code because they constrict you. Style guidelines should be just that: guidelines. Engineers should have the decency not to request stylistic changes during code review unless they spot code that is obviously sloppy.
As for tests: insisting on having tests for everything impedes development in the present as well as in the future. More crucially, the compulsive desire to have everything testable makes you write worse code, because you need to abstract away parts that would otherwise be simple function calls, or use patterns that obfuscate the code. Some will claim this results in better-designed code, but realistically it just results in more complex code, which is almost always worse. You don't need abstractions everywhere. Developers should focus on making shit work well, not on over-engineering code because they want to feel smart, all the while rationalizing about testable code and whatnot.

I digress, though, so back to my original point regarding tests: for tests to justify their existence, they have to test something significant and/or test a module that provides a service (read: hidden behind a well-defined API, which you test to ensure it doesn't break its "contract").
Are tests part of your CI/CD pipeline now? You can always add them to your own code. And slowly increase the number of tests that way.
You might have a harder time adding a style guide if no one cares. You could try to add the style guide as part of your build pipeline. You might need to start by fixing all the low hanging fruit errors yourself.
Whatever you do, don’t do what I did and start complaining about stuff at every step while in a new job. It’s a big ship. Turn it slowly, or be prepared to live in the worst cabin and/or be thrown overboard.
Bring it up in your 1 on 1 with your manager, too. Just ask their thoughts. Maybe they also feel the same way, and have wanted to improve things.
Is there anything extraordinary about this job? You could always just find another job where testing & code quality are taken more seriously if it’s really important to you. If something is special, be prepared to live with the things you don’t like for a while.
There might not be a particular reason for why things are the way they are. It might have just grown to be this way.
I haven't read it but I've heard good things about "Working effectively with legacy code" by Michael Feathers.
> This must lead to a lot of problems which I'll only face after ramping up, but even onboarding is painful if you don't have e.g. clear automated testing set up.
Maybe you'll learn why this is not the case after several months of working in your new role.
* Make sure your own changes are properly tested.
* Make sure existing tests are reliable and quick to run, since these are common excuses for not doing more.
* Develop tooling to make writing tests easier, or to write new kinds of tests. Key word is "new" because that taps into the neophilia that's endemic at these places.
* Use new techniques to reliably reproduce bugs that are still fresh in people's minds, and to show how this can find related bugs as well.
Don't spend too much time on any of these things, though. That will reduce your recognized "impact" and slow the process of gaining trust from the tech-lead "guardians of culture" who have usually decided (based on a mere 5-10 years' experience at only one company) that their current approach is perfect. Better to maximize that impact and then start promoting change. So, to a large extent, "drink the Kool Aid and shut up" really is the best approach available.
As others mention, be careful trying to change the world (and especially saying bad things about what you see) before you grow your credibility. But ask a few questions in team meetings (or, at something like FB, in workplace groups) about what test or staging or related infrastructure is available as part of ramping up, and that might get people that care to notice you’re a potential ally.
There’s even the (admittedly probably very) small chance that the team wants to improve here and doesn’t know how, or didn’t have the resources to do better before. Either way, you’ll learn more without alienating anyone.
Big tech companies also often offer mobility and culture variety, so keep your eye out for teams that align with what you care about. Learn what they do and how they got started at least - or possibly move there.
(If you’re at FB, feel free to reach out to me - same account name.)
The answers should be pretty revealing about the engineering culture at the company. You can then decide what it is that you want to do about it with better information.
You never know, maybe the Big Kahuna tells you you have open mandate to make improvements in the area...stranger things have happened.
Cut your team some slack. Before you go telling them they're doing it wrong, do better yourself. Make sure your code is well tested. When doing code reviews, request more tests strategically. (Meaning, don't complain that there are no tests. Point out specific tricky or dangerous areas where an extra test could make a difference). If they don't know how, help them.
ASK about current practices ("Hey, is there a style guide I can use to help me get up to speed?"). DON'T tell them you know better. If you make them resent you from the beginning, you'll never make any progress.
But don't talk too much about it, and beware of forcing others to work the way you do. If there is something like a code review meeting, you can show it to others there. If there is no SonarQube or similar infrastructure, establish it yourself, e.g. in Docker on your own machine. Have a plan in mind for how to transfer it to company infrastructure, but don't force anyone to adopt it; only offer it when someone asks.
You have an opportunity here to integrate YOUR personal way of testing and spreading the knowledge. Have faith that good things will prevail. Give it a year...
That is a big company in a nutshell: management is busy spinning employees, so good employees immediately call out the BS and leave. What's left is okayish-to-not-so-okay employees. Ultimately the company suffers, but who cares, when bonuses are handed out without anyone understanding what is really happening in the company.
I've found that there are a few key junctures at which people take to testing. Here are some:
1) When dead-simple unit-tests catch old bugs. If cos(0) doesn't return 1, there's a problem.
2) When unit tests, for the first time, catch an important new bug in code they're writing.
3) When there's enough testing coverage that you can rip the guts out of a function, install new guts, run the tests, and be certain that everything is fine.
4) When good coverage newly applied to old code finds really subtle very old bugs in code everyone trusts.
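Point 3 is the big one in my experience. A toy sketch of what that looks like (names illustrative): the tests pin down observable behavior, so you can swap the implementation and know in seconds whether anything broke.

```python
# Toy illustration of point 3: with tests pinning down behavior, you can
# rip out a function's guts, install new guts, and rerun the same tests.
def dedupe(items):
    """Return items with duplicates removed, preserving first-seen order."""
    # Old guts: quadratic membership scan over the result list.
    # result = []
    # for x in items:
    #     if x not in result:
    #         result.append(x)
    # return result
    # New guts: linear time with a seen-set; same observable behavior.
    seen = set()
    return [x for x in items if not (x in seen or seen.add(x))]

# The same tests validate both implementations unchanged:
assert dedupe([]) == []
assert dedupe([1, 1, 2, 3, 2]) == [1, 2, 3]
assert dedupe(["b", "a", "b"]) == ["b", "a"]
```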
Use the easiest-to-use unit-testing library you can. If there's no test-coverage today, some testing is worlds better than no testing. Once people can see for themselves how helpful it can be, the organization can get fancier if it needs to do so. I absolutely love GNU Octave's testing framework [1], as the syntax is as simple as
  function r = foo(x)
    r = x + 2;
  end

  %!assert (foo(3) == 5)
When it's easy, people take to it like water. I perpetually emphasize to students the importance of writing the most-boring/simple test first.

Just do what you need to do in order to be productive. If that means writing tests, do it. It doesn’t matter if there’s CI in place. Maybe only you run the tests on your local machine occasionally. Just do it. Testing is a part of software engineering and you don’t have to ask permission to do your job.
Practice what you preach and ship excellence and if you do it well with good management then you'll be able to shape the team. Otherwise, good luck because we have yet to iron out what good quality even means.
I operate at a very high level in an organization but in general I accept disagreement, keep track of what the outcome was, and if I was right decide if I want to revisit the issue with the new data. The best outcome is I was wrong and nothing needs to change, or second that I was right but the impact was small so the effort was unjustified. If it turns out I was seriously right, then we can start putting motions in place to fix the issue, and I get credit for voicing concern early. I hope to be wrong!
For all those saying "get a new job": not that simple. Immigration. If things are real shitty (which they aren't!), I'll get a new job anyway and move back home. But I'd rather not.
But if you want to try, measure downtime or loss of productivity. Then convert it to dollars, assuming the industry's average rate. And put the dollars in parentheses, ($10,000), to denote the loss.
Write an email, forward it up the chain. Then when the next outage happens, gently point out that email, and be prepared to offer solutions.
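The arithmetic is trivial, but writing it out makes the email concrete (every number below is a made-up placeholder; substitute your own estimates):

```python
# Back-of-the-envelope sketch: convert lost engineer-hours into dollars.
engineers_affected = 8
hours_lost_each = 5          # per outage, per engineer
loaded_hourly_rate = 100     # rough fully-loaded cost per engineer-hour

loss = engineers_affected * hours_lost_each * loaded_hourly_rate
print(f"(${loss:,})")  # → ($4,000)
```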
Write tests for things that you work on.
If it’s valuable, and you are a productive team member, you will eventually be able to convince others to adopt your practices.
The best way to convince people to test is to show it's valuable by finding/preventing bugs.