What popular tech advice do you wish you hadn't followed?
In particular, typing into an 80x25 terminal window with unformatted fixed-width characters should be obsolete; we don't need compatibility with punch cards any more. There was a thread a week ago about displaying graphs in your terminal using ASCII graphics. This shouldn't be necessary!
Also, code shouldn't be a sequence of 80-character lines like a deck of punch cards. It makes no sense to carefully organize the order of code in a file for things that don't matter. Chunks should float around and be displayed wherever it makes sense, not tied to an arbitrary location in a file. Code should be more like a database than a flat file. (Yes, I know the arguments that flat files are better for tools like grep.) We should get out of the 1970s editor model; Emacs is from 1976 and vi from 1978.
The whole command-line operating system approach is like being stuck in the 1970s, even using the same commands. Yes, small tools composable with pipes are nice, but there must be a better approach. We have powerful graphics, but we're typing into terminal windows that simulate a Teletype.
To summarize my unpopular rant, hardware and I/O are exponentially better than in the 1970s, but our programming environment is still stuck with the approaches of the 1970s. Computing needs to get out of this local maximum. For a supposedly fast-moving field, computing changes at a glacial pace.
The thinking is that with Docker you can manage your dependencies in one place, so you don't need to spend time and headache dealing with issues like incompatible packages.
In reality, you trade that time and headache for the time and headache of dealing with Docker: did you mount all the files you needed? Did you start the container with the proper command to enable debugging? How do you attach a debugger to a process inside the container? And personally I don't like the extra "state" I have to maintain in my head that I'm in a Docker container.
My overall argument is that if the dependencies are so crazy that you're wasting time on conflicting packages, take a hard look at those dependencies and see if you really need all of them, or if you can simplify.
To be absolutely clear, Docker is still great for certain use cases. For spinning up new CI machines it's fantastic and really simplifies that process in a meaningful way. But for local dev I think the advantages and disadvantages wash out, making it a net zero at best and a net negative at worst, since that "state" will always be there.
We need operating systems that enforce file selection, instead of deferring to the applications. This is known as capability-based security. Most people think of "allow access to X" on their phone, which is NOT what I mean.
When you want to open a file, an application shows a system-supplied dialog box, then goes ahead and opens whatever the heck it wants with the full trust of the OS, no matter how wrong that is.
All I'm asking for is the OS to handle the file dialog and pass a capability (like a file handle) to the app instead of trusting it blindly.
The downside is that unless there are clever shims, this requires rewriting everything.
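The idea is simple enough to sketch in a few lines: the trusted side does the opening and hands the app only an already-open handle, so the app has nothing with which to open other paths. This is a toy illustration only (Python can't actually sandbox the app, and `os_file_dialog` is a hypothetical stand-in for the OS picker):

```python
import io
import tempfile

def os_file_dialog() -> io.TextIOWrapper:
    """Trusted side: the OS shows its own picker, opens the file the
    user chose, and returns only the open handle (the capability).
    Here the user's choice is faked with an anonymous temp file."""
    f = tempfile.TemporaryFile("w+")
    f.write("user-chosen contents\n")
    f.seek(0)
    return f

def untrusted_app(handle: io.TextIOWrapper) -> str:
    # The app can use exactly the capability it was granted...
    data = handle.read()
    # ...but it never saw a path, so in a real capability OS it has
    # no ambient authority to open('/etc/passwd') on its own.
    return data

print(untrusted_app(os_file_dialog()))
```

The app's interface changes from "give me a path string" to "give me a handle", which is why the comment above notes that everything would need rewriting or shimming.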
---
The reason we have VMs, then Docker, then containers is that our operating systems don't enforce capability-based security, and we've been improvising a replacement for it in this manner.
These tools allowed domain experts to get a GUI tool built in a matter of hours, and for the most part, those solutions just worked.
Microsoft Access 2000 managed tables and relationships automatically. You could have master/detail relationships multiple levels deep and it all just worked. It did queries well enough, hooked to SQL via ODBC, and did pretty good reports.
It was a pretty cool environment when you tied it all together.
- Then Microsoft got obsessed with .NET and Borland got greedy, and ruined it
On the Web side of things, there were WYSIWYG Web editors that worked in much the same fashion, with a whole stack to support it. You could lay out a Form, and have a CRUD app running on the web in short order.
- Then Ruby on Rails came along, and ruined it
The main thing we've gotten in the past decade is faster hardware, especially thanks to SSDs, and Git. Git is amazing, but don't be fooled into thinking it actually stores deltas; it just acts like it does. Git is a snapshot tool, way better than stacks of floppy disks/USB sticks labeled Archive001 -- Date
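The "snapshots, not deltas" point can be sketched with a toy content-addressed store in the spirit of Git's object database (this is illustrative pseudocode of the model, not Git's actual implementation; real Git adds packfile delta-compression underneath as a pure storage optimization):

```python
import hashlib

store = {}  # toy object database: sha -> full file content

def commit(files: dict) -> dict:
    """Record a snapshot: every file's FULL content is stored,
    keyed by its hash -- no diffing against the previous commit."""
    tree = {}
    for path, content in files.items():
        sha = hashlib.sha1(content.encode()).hexdigest()
        store[sha] = content          # a whole blob, not a delta
        tree[path] = sha
    return tree

v1 = commit({"a.txt": "hello\n"})
v2 = commit({"a.txt": "hello\nworld\n"})  # one line appended...
# ...yet the store holds two complete copies of a.txt:
assert len(store) == 2
# Unchanged files hash to the same blob across commits, so
# snapshots are much cheaper than they sound.
```

Asking for "the diff between v1 and v2" is then a computation over two full snapshots, which is why Git only *acts* like it stores deltas.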
Every time I hear someone wanting to rewrite something without a VERY solid reason, it's a red flag for me. It is hard to improve a system that's already there, but it can be super rewarding if you change your mindset.
Everyone is eager to go with a blank canvas and greenfield a project but ultimately we’ll end up with the same shit situation in 5-6 years after the original authors have bounced and the newer devs are side-eyeing the code.
- Writing Rust is generally unpleasant - trying to figure out how to do something that you don’t yet know how to do is much more painful than doing the same in another language.
- Unit tests are overrated.
In a more personal context, I do care about software quality… but this only applies to personal projects.
There's no difference to a business between first-gen infrastructure-as-code (Terraform/CloudFormation) and second-gen (CDK/Pulumi/etc.). They both should be replaced by a developer platform that runs in your own cloud.
The best tool to build on is something that combines the two with one simple config, like what we're building at withcoherence.com!
* A design which can be generalised is inherently more complicated, and is often unnecessary (you probably won't make use of that generality, leaving you the burden of maintaining it), so you'll be glad you made a simpler first pass.
* I don't like working with docker/containers personally. It makes me feel so far removed from how services are deployed that it's basically magic.
Nix is the single worst thing that's happened to Haskell development tooling. The last 9 months at my current job have been Nix misery non-stop.
Nix in general is hot garbage.
Today the market moves too fast, priorities have changed, and there seems to be less of a willingness to "do the right thing".
As such, abstraction for the purpose of avoiding copy/paste/modify is usually bad. You should just copy/paste/modify, and maybe extract out common functionality if the common functionality is obvious and well-contained.
Epistemic status: I don't fully endorse literally the above take, but it gestures in the direction of my actual opinions if you're starting from the viewpoint "Robert Martin's Clean Code is approximately correct", which seems to be the majority viewpoint in my circles. As always, take into account the rule that you should probably reverse any advice you hear[1] -- if this take resonates with you, it is unusually likely to be bad advice for you in particular.
[1] https://slatestarcodex.com/2014/03/24/should-you-reverse-any...
Job-security subprojects. At first you have a bright-eyed junior dev who wants to do the right thing; change the world, they said.
Little by little he realizes that documenting everything and writing easy-to-follow, bug-free code is a great way to work yourself out of a job.
Complex code is what gets you promoted, because you have fires to put out, bugs to fix, and meetings you get pulled into where you have to explain the intricate architecture.
Embedded Linux is more trouble than it's worth 6.5 times out of 10.
A lot of PCB developers pretty much refuse to perform proper V&V on their designs because they're allowed to push it off on firmware engineers.