I go through various forums and find that people DID A LOT of cool stuff while I was busy playing with Arduinos and OpenCV. I did learn about the kernel a bit, but that's about it. The amount of development that I see people have gone through in the last 5 years is insane. Sure, there is a lot of sugar coating, but I still feel the amount of compute and resources that are available now might mark the point of no return. I hope I am making sense here.
A headhunter called me out of the blue. We talked for 20 minutes, he had one -one- option for me that was a perfect fit, and I started working at a boutique consulting agency.
Fast forward 5 years; I'm leading a group of awesome consultants at an even cooler, massive consultancy. I'm doing work for clients I never imagined working with, and learning emerging technologies. Work is fun, clients love us, my team are brilliant and funny, I completely reinvented my career, and I'm making triple what I was a few years ago. Yes, there is a lot of grind, and I don't create like I used to. I don't want to give you the wrong impression, but it's AWESOME nonetheless.
Sure, I did super-cool stuff in 1994, and again in 2004. In the 80's too, if I think about it. I miss those days. Maybe I should have peaked then, but I'm too stupid to stop surfing whatever wave is in front of me.
Opportunities abound for anyone curious and willing to take risk. Age is -not- a construct (aches and pains are real), but as you age you acquire wisdom, and it's surprising how smart younger people take the stupid stuff I say as profound. I'm actually helping people. I never dreamed that 5 years ago.
Oh, hey. Tech is too big to follow. Find your interests and pursue them, leave the rest. I love generative art and followed it for a few years, but it's only recently that it's more accessible. I'm one of the leaders in a 10K+ person org. At 55. In addition to my day job. Just have fun. It all works out in the end.
Life is a jungle. Swing from the vines.
On a personal note, every step of my career has been forward and also unexpected. What I envisioned myself doing and what I ended up doing were two different things. And yet I'm happy and successful. Age 41.
Now, in my early 40s, I feel that I am finally starting to hit my stride!
It can be somewhat true societally speaking, but it makes little sense for an individual creature that is part of that society. Computers are still basically doing the things everyone predicted they would back in the 70's. The tooling is nicer, but the paradigms of how they should be programmed have crawled forward very slowly relative to the research of that time. So what, exactly, has actually changed? Why would we have to keep learning for 50 years to have jobs in software?
Most of what changes in industry is just a "meta strategy", like when you leave an online multiplayer game for a bit and come back and see that everyone is using different builds. Train the muscle memory for that build and suddenly you are competitive again, inside of a year or two. Like when a prize fighter comes out of retirement for a big matchup.
Getting ahead of that and reaching a higher grade of achievement is all in doing the minimum of meta churn and instead finding the thing that differentiates you when you go deeper with it than anyone else and solve hairier issues - being the artist who works with unusual media or processes.
And if you're feeling like you aren't working on the right things, then the answer is not desperation, it's to engage with some philosophy and find more logical coherence in how you see yourself, your identity, goals and how you pursue them. What school prepared you for was to know a few things about computers, not to "be a programmer". Being a programmer is ongoing maintenance, a thing of doing enough rote work to still be inside the meta. That's all.
This did not affect my career that much, and I am glad I got exposure to obscure concepts, but it feels like I had to learn the basics a little late.
Keep coding!
I feel like I am a master of none and it really fuels the impostor syndrome.
I think some may be missing the point here that your brain sort of starts to go downhill at this point in terms of fluid intelligence. I feel your pain and hope that we have somewhere to go.
Industry moves in many directions and some of those roads come to an end.
Don't worry too much what others have done and keep progressing your knowledge with the road you have taken.
Don't be afraid to change path or direction if you feel it is what benefits you, and don't regret.
Sorry, no.
1960s - the military realizes that a single computer cannot handle data at different levels of classification. (This was related to planning classified flight operations during the Viet Nam conflict: the flights themselves had to avoid enemy SAM sites, knowledge of which was Top Secret, even more secret than the flights, etc... and those were different levels of classification.) Research to solve this problem was done, and progress was underway to build this into Multics... when Unix took off, and distracted everyone. There have been some niche secure systems available, but widespread knowledge of them didn't happen. Security of that level wasn't seen as necessary, and eventually was seen as impossible anyway. Note that the solution to general purpose secure computing was found, and proven to work, decades ago!
1970s - general purpose personal computing came along, again without security in mind. BBSs arose, along with UUCP, FidoNet, etc. in the public sphere.... ARPAnet in the Military/Educational area.
1980s - the IBM XT (or clone) with MS-DOS and dual floppy diskettes was the pinnacle of secure general purpose computing. The shareware revolution happened, and most PC users were happy to "buy" $2-3 floppy disks in bulk with various programs from strangers at computer shows, and just try things out.
Why was it secure? A floppy diskette full of data is a coarse-grained "capability". You know (because you insert/remove them, and attach write protect labels) exactly which disks are in the system, can make backups of them easily, and it's effectively impossible to mess up your computer with a bad program.
You also had BBSs from which you could download software to try out. This was peak computer user freedom, even though the machines were slow and the diskettes weren't perfectly reliable. You could just try things, without worry. Nobody has that freedom any more, no matter what OS they run.
The Windows Era - The adoption of hard drives and GUI interfaces brought an end to users having transparent and full knowledge of where and how their data was stored. The need to "install" software transformed what was once a matter of copying a boot floppy into an impossible-to-replicate system state. Hard drives were expensive, and fixed... you couldn't just copy them freely, like you could with diskettes. This was the first step in the descent.
Still, there were some great tools introduced at this point. With the Mac, you had Hypercard; on the Windows machines, you could get Visual Basic or Delphi, and build applications to do CRUD or interact with custom hardware fairly easily. Documentation was included, complete, comprehensive, and amazing.
Then the .NET era happened. This made software slower, there was always a new .NET library to load, and things crashed far more often. While it might have been a good move in preparation for the migration away from the Intel instruction set, that has taken decades, not years, and the framework has been through several incompatible iterations along the way. We lost VB6 and Delphi and Hypercard along the way.
Simultaneously, the Internet was released for commercial use. Eventually, we came to have systems with persistent internet, but operating systems intended for the classroom or small corporate environment. Any thought of security was layered on top, not built in.
Then the web hit, and we shifted from high performance, easy to build and distribute desktop applications to a model where everything is shoved through a stateless protocol through firewalls and proxies to end users on machines they don't fully control, own, or understand. It's a huge mess, and it can't be cleaned up because none of the computers at the edges are secure enough to run random code.
We could fix this... and I've been trying to push that message wherever it seems like the ideas might take hold... if we abandoned the flawed concept of ambient authority that underlies Windows, MacOS, Linux, etc... and went with a model that defaults to no access, such as the ever-delayed Hurd, or Genode, then it would at least be possible to get back the ability to run mobile code without risk.
Once that almost impossible task is done, then we can take the code-generating tools we built for Windows back in the 1980-90s, like Visual Basic 6 and Delphi, and recast them to generate code to run directly on the phones, tablets, laptops, desktops, etc. The end user can easily manage security with the powerbox facilities that capability-based OSs provide. (They look just like the file open/save dialogs we're all used to, but then only provide access to those files to the application.)
Note that this is NOT the same as "permission management" on your tablet/smartphone.
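To make the difference concrete, here's a minimal sketch (plain Python, purely illustrative - not a real OS or powerbox API) of ambient authority versus capability passing: in the first style the code can open any path the process can reach; in the second it can only use the file handle the user explicitly granted, which is roughly what a powerbox hands an application.

```python
# Illustrative sketch only -- not a real OS or powerbox API.

# Ambient authority: the function can reach anything the whole process
# can reach, so a bug or malicious payload can wander the filesystem.
def word_count_ambient(path: str) -> int:
    with open(path) as f:  # nothing stops open() on some secret file instead
        return len(f.read().split())

# Capability style: the caller (think: the powerbox dialog) opens the
# file on the user's behalf and passes only that handle in. This code
# has no way to name or open any other file.
def word_count_capability(granted_file) -> int:
    return len(granted_file.read().split())

if __name__ == "__main__":
    # Create a throwaway file so the sketch runs standalone.
    with open("example.txt", "w") as f:
        f.write("hello capability world")
    # Stand-in for the user picking the file in a powerbox-style dialog.
    with open("example.txt") as granted:
        print(word_count_capability(granted))  # -> 3
```

The point of the second style is that the authority travels with the reference: the application never holds a general "open anything" permission, only the specific objects the user handed it.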
We could be heading towards a bright secure future, where we all own our own hardware again, and things just work, quickly, without bloat, without virus scanners, the way we want them to...
or not
I think we've got a 0.1% chance for the former at this point in time. I'll do whatever I can to get that up to 0.2%.