Maybe it's because of the advent of self-taught engineers? Or maybe these concepts are just "too boring"? I'd like to know if I'm biased or not.
2) there’s too much to learn and everyone wants to be a generalist. I’ve spent my entire career levelling up and I still don’t know enough most days.
Can anyone comment if these newer degrees are better/worse at imparting the kind of theoretical knowledge of which OP speaks?
> engineers putting a lot of logic in their unit tests
Like is there a class which is supposed to teach these things? Seems like the kind of thing you have to learn through experience.
Regarding being self-taught... I consistently find that I outperform the traditionally educated engineers on my teams on a theoretical level, but I think this comes from many, many years of deep reading and small, focused experiments.
I don't even know how all of the nuance which drives sound theoretical frameworks can be imparted to a student in a 4-year curriculum, so I wager that a lot of this is taught on the job. So my next question is: what is the average developer team environment like? Is there learning on the job? Is the emphasis placed on "getting shit done" vs "getting it done right"? My career has been entirely startups and independent consulting, so I also have little insight into the bulk of the industry.
Anyway, I wouldn't discount self-taught engineers. If you know a self-taught engineer who has broken into the professional scene and isn't a junior, they likely have a comprehensive skillset and deep practical knowledge which converts to theoretical knowledge. Many self-taught engineers have a lifetime of experience.
I generally agree with you that software engineering today seems less rigorous. I also wonder if the internet is just making it easier to cross-examine engineering practices across the industry, and if there's always been such a distribution of people who deeply care about getting things right vs people who simply see programming as a job in which they should invest the minimum.
Most people just care that it works and not why.
Really dangerous depending on the problem and field tbh
I felt as if they went for massive breadth instead of focusing on key concepts in depth. We brushed over data structures and algorithms (this is pretty poor since Leetcode-esque questions are now standard for entry level interviews), never really learning the theory behind them, when to use each one, their efficiency etc. No maths besides probability and basic calculus...
Luckily, all of my uni friendship group did comp sci and we love it, so we often went deeper into relevant topics together in our own time; otherwise I feel like I would've found getting into the industry pretty tough. I'm a machine learning engineer now and I'm basically self-taught; my degree didn't help much at all.
Courses teach a lot, but they also miss out on tons of topics as well. There’s limited reinforcement between classes, so even if your professor covers your topic of choice, the next course may not.
Unit tests, for instance, are generally used by students to check their homework assignments as they go, but those tests are just the (incomplete) grading rubric. I didn't actually write many unit tests in college until my software design class, which was in Smalltalk.
I didn’t use version control until I worked on a few projects with a fellow student who self hosted SVN.
I used 6-8 different programming languages in school, each for a semester at a time. There's simply no room for repetition and mastery until you get a job, either as an intern or in a permanent position—and then you're the engineer you talk about.
Every new engineer is different, depending on their program and interests, even within the same school’s degree program. They’ll all need mentoring to grow. No one is ever going to graduate and be a mid to senior level purely from school.
The problem is "bootcamps" and other such businesses promising so-called six-figure salaries. I think most people who find these bootcamps appealing aren't really interested in technology. They just want money. I'm sure they wouldn't learn any of this stuff in their spare time, as a hobby, having genuine fun.
So if you're talking about fresh graduates, it's always been like this AFAIK.
Bootcamps are different but they're focused on developing attractive applications.
Choosing the right data structure is also hard to reason about when you're building small applications; on the scale of a student project you won't hit bottlenecks unless you do something outrageous.
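A rough illustration of that (a toy benchmark I'm making up here, not from any student project): a linear scan over a Python list and a hash lookup in a set are indistinguishable at student-project sizes, and only diverge once the data gets big.

    import timeit

    # Membership test: list (O(n) scan) vs set (average O(1) hash lookup).
    # At student-project sizes the difference is negligible; at scale it isn't.
    for n in (100, 1_000_000):
        items_list = list(range(n))
        items_set = set(items_list)
        probe = n - 1  # worst case for the list scan
        t_list = timeit.timeit(lambda: probe in items_list, number=100)
        t_set = timeit.timeit(lambda: probe in items_set, number=100)
        print(f"n={n:>9}: list {t_list:.6f}s  set {t_set:.6f}s")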
You too probably learned about it from conference talks and experience working in the industry, but in hindsight it's "basics".
These are not necessarily the same thing (depends on the school), and many people who program computers are neither engineers nor scientists in any field, let alone in computing and software.
In all fields, not only programming, if you hire people without formal training, you get people without formal training. Sometimes they will acquire the same learning organically over many years (and maybe even surpass their capital-E Engineer brethren), sometimes they won't. You may even prefer them to not have the rigidity of formal training.
Moreover, this is not a new thing, nor, necessarily, a bad thing. The Wright Brothers weren't trained engineers. Qualified engineers of the time would have considered them woefully under-educated even for simplistic design work, and yet they built a working plane when no one else had.
On top of this, if you're not flexing the "theory" muscle often, it's easy to forget a lot of the details even if you have formal education. I think this applies to a lot of us who have been out of school for a bit.
I love theory, would love to work on it for a living, but here I am.
The other day I was interviewing with a recruiter for a Go dev position, and she asked "what is a map?". I couldn't believe such a basic question was being asked. I explained, then asked her why such a basic question; she said most applicants couldn't tell her what a map was (shock face).
Next question was "what is a hash table?", even more shock...
I'm a Computer Engineer who started to program when I was 12, on a 286.
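For anyone wondering what the recruiter was fishing for with those two questions, here's a minimal sketch in Python (the interview was for Go, so take the language choice as my own illustration): a map associates keys with values, and a hash table is the usual way to implement one, hashing the key to get average O(1) lookups.

    # A map (dict in Python, map in Go) stores key -> value pairs.
    # Under the hood it's typically a hash table: the key is hashed to
    # pick a bucket, giving average O(1) insertion and lookup.
    ages = {"alice": 34, "bob": 29}    # build a map
    ages["carol"] = 41                 # insert
    print(ages.get("bob"))             # lookup -> 29
    print(ages.get("dave", "absent"))  # missing key -> default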
My experience is that good engineers getting things done is something that happens in engineer-led companies that pay well. If you're increasingly seeing the not-so-effective engineers, then perhaps that's a reflection of the company being one that doesn't try to, or can't, attract talent?
This doesn't make it a bad place to be. All that matters is that people are happy and worklife is enjoyable.
That said, while I do believe that there’s a practical limit to how much depth of knowledge is required for most jobs, it’s perilously close to what is helpful to know. For example, while you don’t need to know how a B+tree works to write data into a database, knowing that the DB uses that (and in what capacity) can be enormously helpful when determining how to structure your table schema. Or at a higher level, you don’t need to know SQL to use an ORM, but it makes spotting performance problems much easier if you do.
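To make the second point concrete, a small sketch using Python's stdlib sqlite3 (table and column names are invented for illustration): the loop below has the shape of a naive ORM lazy-load, and knowing the SQL underneath is what lets you spot the N+1 problem and replace it with a single JOIN.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE posts   (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
        INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
        INSERT INTO posts   VALUES (1, 1, 'On Engines'), (2, 2, 'On Compilers');
    """)

    # The shape of a naive ORM lazy-load: one query per author (the N+1 problem).
    authors = conn.execute("SELECT id, name FROM authors").fetchall()
    for author_id, name in authors:
        titles = conn.execute(
            "SELECT title FROM posts WHERE author_id = ?", (author_id,)
        ).fetchall()
        print(name, [t for (t,) in titles])

    # Knowing SQL, you spot it and fetch the same data with a single JOIN.
    rows = conn.execute(
        "SELECT a.name, p.title FROM authors a JOIN posts p ON p.author_id = a.id"
    ).fetchall()
    print(rows)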
How you feel about these engineers is how lead engineers feel about how you feel about these engineers.
It’s hierarchical; you are probably a senior dev, and as such you notice the mistakes junior devs are making, which is fine and good. However, your attitude towards those junior engineers, that they must have something wrong with them, is a terrible mistake that only a senior dev would make.
The inability to take another's perspective, or to recall your earlier career, is the same kind of mistake to a lead dev as putting too much logic into a unit test or selecting the wrong algorithm is to you.
Unlike you however, a lead dev will understand this is just how people are at this part of their career, and know that you’re probably coachable and can learn to be better than this.
Could be relative, depending on level of experience, opportunity of hands-on implementation and retention of knowledge.
When I was interviewing at an eCommerce startup, they were all over DB concepts and transactions. Then I was at an enterprise, where they swore by design patterns and heavily pushed the wrong patterns in the wrong places because a senior engineer said so. Then I went to a hardware manufacturer, and they had zero concept of any best practices; it was bit manipulation and inline optimization all the way through.
Depending on where one is exposed for a long enough time, some concepts are lost, some are reinforced.
At the end of the day, theory doesn't bring in any value on its own; whatever gets the job done, brings in the money, and keeps the machine churning away is what's crucial.
The market wanted a lot more developers and there weren't enough immigrants to fill those positions, so tons of companies lowered the bar for entry.
Your average dev these days is just so much worse, which leads to a lot of the observations you have.
What are you (you reader!) doing to support your Junior developers?
It's tempting to suggest things are getting worse though (and I am bad for this) but I have encountered so many very senior people, in some cases famous ones, who don't actually understand what they're doing. It is scary how far confidence and an effort to rewrite history to hide all trace of your previous mistakes will get you.
What has changed is that the salary and expected workload are known in the wider community, so a lot of people who treat programming as a production-line grind rather than a creative process have shown up. In the process they've skewed management expectations for how things could and should operate, completely oblivious to the fact that much of their work, if appropriately structured, should be done by the machines they spend their lives swearing at.
On the other hand, I think increasingly few SWEs enter the professional workforce with prior “casual” (meaning hobby or similar) experience. Whether or not that’s a bad thing probably depends on whether you think programmers should love to program vs. have professional boundaries.
I suspect your observation results from your improvement over time. Plus, your relative skill level monotonically increases as others retire and juniors replace them - even if you stay stagnant.
There are other causes too:
- Major tech fads (remind me - is OOP in fashion or not right now?)
- New topical emphases for industry (is it more important to take that second data structures class, or that first LLM class?)
- A changing definition of who is an engineer (should business analysts be deploying pipelines in the data warehouse?)
All this to say: there's probably more people with a fantastic understanding of software engineering today than there ever has been. And the likelihood of working with one of them might still be lower than it ever has been.
This makes it sound like a recent phenomenon. Self-taught software engineers have been around quite some time. When I attempted to attend post-secondary school (early to mid 1990s), the idea of “computer science” was still too new to academia in general - the course offerings taught little to nothing outside of what one could pick up on their own.
What other fields do, like civil engineering and medicine, is have a licensing board. The board certifies that you know how to do the job. Often studying for the exam requires covering topics that weren't part of the curriculum.
It's easy to say that there are too many variations / topics in Software Engineering to have licensing; but medicine has just as many sub-specialties and they figure out how to have good exams.
One more thing to point out: In medicine, job interviews are mainly based on mutual interest; because the licensing board determines competency. We (software engineers) could learn a lot from that process.
1) the IT field keeps diversifying at a rapid rate. Thus the schools try to teach too broad a topic set, making generalists in what appears to be a specific field. Oh, so you are Robotics? Is that robotics, controls, building design, human interfaces, power management, AI, blah blah? To be a well-rounded robotics engineer would probably take a solid 12 classes. To be good at a specific topic would be another 12 courses per topic.
2) tight job market makes it hard to acquire talent. Your boss does not want to match YOUR salary on a new engineer as that would anger you. Thus they aim lower ...
3) the really talented new engineers are possibly not interested in your field. As such the ones who have a passion are going somewhere else.
4) the explosion of money has brought in a LOT of people who are more passionate about the paycheck than the tech. Which is not wrong, but does not provide the level of knowledge you seek.
I can only assume you are passionate about tech, and thus spend spare time studying. You also delve into topics. I have found MOST techies are not so passionate. Most learn a specific topic and that is it. Tech is not their hobby as well as their job.
I view it as something to regurgitate on command when interviewed. It is otherwise worthless knowledge.
I'm not about to say it's been transformative or profoundly enlightening, but I do consider it to be useful and valuable to be able to know these things when working on projects where large data sets and performance issues are involved. I'm no longer able to nod along when people go on those rants about how LeetCode and whiteboard tests are just testing for conformity and obedience; I can now see how these concepts are important for people to know when working in the kinds of roles that the major tech companies are hiring for.
I think the bigger realisation I've had is that academic Computer Science and contemporary software development are just quite different fields, and it's non-obvious, until you make the effort to bridge the gap, just how little they have in common, at least until you become more advanced as a software developer.
I agree that it's mostly due to the fact that many/most software developers these days are self-taught or taught in bootcamps or "IT" courses that just don't teach much or any fundamental CS. And I agree it's probably not a bad thing on the whole, but I also think it would be better if self-taught software developers were encouraged to believe that learning these concepts at some point is achievable, worthwhile and possibly quite satisfying and confidence-building.
It's not required to be employed though.
Go try to build a list of basic universal and theoretical concepts and I guarantee someone will disagree with you about something on it. Someone else can just as easily declare some knowledge that you omitted to be basic, or some knowledge you included to be not basic. Now that you've gotten something wrong, should we be questioning you because you missed something so basic?
I partially understand that people might not be interested in the details of how things work. This makes them stuck on non-trivial tasks, but whatever.
One thing I don't understand - they no longer search for "interview questions" before an interview.
I always avoided such questions because I felt the answers were memorized word for word, but now asking about basic concepts means no answer, or a wrong one.
I hope that's only the case in the frontend world, where things "don't matter" that much.
Back when I first got online, most people calling themselves programmers were self-taught and they usually had a very strong interest in learning adjacent skills so we had plenty of discussions about best practices and data types on IRC.
And then the mass-manufactured 6-week bootcamp "consultants" arrived and basically just used whatever algorithm shows up first on StackOverflow. Those folks are mostly in it for the money, so they won't learn about algorithms in their free time because there is no direct link from knowing them to a salary increase. They also rarely participate in group discussions, but instead jump in and interrupt everyone else to ask for help on their specific problem.
And that's why some of the old self-taught people now look like 10x engineers (in comparison to those bootcamp consultants).
I do not work at Amazon.
I think what I'm saying is a large company like Amazon has figured out how to get by with a small subset of "historical" CS knowledge. Maybe .edu is responding to industry needs by only teaching kids the three or four concepts they need to pass an interview at Amazon.
I doubt that it's this. For much of the modern history of the software industry, almost all engineers were self-taught, but this is a more recent phenomenon. And it's one I've seen as much from college graduates as others.
Personally, I think it's because a lot of software engineering is climbing higher up the ladder of abstraction. Having a career as a dev without understanding the low-level basics of how computers work is entirely viable now. Just a couple of decades back, it wasn't.
Engineers, especially software engineers, are far less frequently self taught now than 20 or 30 years ago. There's far more to learn, far more opportunity for formal education, and (don't laugh, I'm serious) the level of professionalism required is significantly higher now. When I was a kid you could get an entry level programming or web dev job on the strength of building a web site over the holidays. Now, most places you need some sort of formal qualification to be an intern.
It takes a while to develop a fascination for the internals of programming which comes after you start building things.
The problem is that a normal person would look at the industry, look at what is the most popular, and think: this must be the way it should be done.
But pick any point in programming history and you would then just be learning the latest fad.
And you can only know how bad something is once you understand it completely, in both theory and real-world experience. Only then are you adequately able to criticize it. So when the new hyped thing breaks through to adoption, it's then at least 10 years until people can adequately point at the holes in it... and see which stick around and which get fixed.
Yet every single new thing, people still convince themselves that this is the way of the future forever.
I think an unbiased History of Programming Languages is probably the best thing people can learn. After learning a few modern languages of course. Then you can see why things are the way they are.
It's funny to hear people criticize frontend for churn - which is true. But at least the language has stayed the same. Backend is actually incurring a lot more churn in not only the libraries and frameworks like in the frontend world, but the whole programming language too.
The people we hire now don't seem to have learned anything they weren't forced to learn in school. The profession is becoming more blue collar but much of what we do isn't routine.
One should distinguish between computer science and SW engineering. The former is full of theoretical concepts, and the latter consists primarily of opinion. I know SW attracts people who like hard boundaries/divisions, but amongst the engineering I've been exposed to[1], SW is the least cut and dry.
Data structures is CS. Unit tests is SW engineering. The former is as clear cut as math, and the latter is a relatively new introduction that has widely differing opinions amongst its expert practitioners. Of the senior, good SW engineers I've worked with, you have some who are pro-logic in unit tests. Others who are anti-logic. I fall in the latter, and I counsel young developers that way, but I won't ding a senior developer in a code review for it. I am no more right than he is.
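To make the unit-test point concrete, a minimal sketch with Python's stdlib unittest (the function and values are hypothetical): the first test re-implements the formula it is supposed to check, so a shared bug passes silently; the table-driven version keeps the logic out of the test.

    import unittest

    def to_celsius(fahrenheit):
        return (fahrenheit - 32) * 5 / 9

    class TestToCelsius(unittest.TestCase):
        def test_with_logic(self):
            # Logic-heavy style: the test duplicates the formula under test,
            # so a bug shared by both passes silently.
            for f in (32, 212, -40):
                self.assertEqual(to_celsius(f), (f - 32) * 5 / 9)

        def test_table_driven(self):
            # Logic-free style: expected values are stated literally.
            cases = [(32, 0), (212, 100), (-40, -40)]
            for f, expected in cases:
                self.assertAlmostEqual(to_celsius(f), expected)

    if __name__ == "__main__":
        unittest.main()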
Data structures: The wise engineer knows when it matters. A very popular C++ book advocates vector usage even when theory says a map/set is better. Why? Because vectors have been optimized like crazy, and for data sets of a certain size, the vector almost always performs better. So he recommends getting data of representative (or max expected) size, and benchmark.
Have you done that for your code?
Even if it would be faster, how much of a bottleneck is the existing code? If only 2% of the time is spent on this portion of the code, does it matter if you speed up that 2% ten fold?
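To put a number on that, a back-of-the-envelope sketch of the Amdahl's-law arithmetic (my own illustration, not from the book):

    # If a section accounts for 2% of runtime and you make it 10x faster,
    # the overall program only speeds up by about 1.8%.
    fraction = 0.02   # share of total runtime spent in the section in question
    speedup = 10      # how much faster you made that section
    overall = 1 / ((1 - fraction) + fraction / speedup)
    print(f"overall speedup: {overall:.3f}x")  # ~1.018x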
[1] My background is in one of the "hard" engineering disciplines - not CS.
All good engineers are self-taught whether they went to university for CS or not. Almost everything required to become skilled or knowledgeable happens outside the classroom. If you don't have an interest or passion to be deeply competent, it won't happen through osmosis because someone else is teaching you, you still have to do the hard work and experimentation on your own. Mediocrity in software engineering has been the norm as long as I've been in it.
Also, "self-taught" engineers aren't new, and some software domains like hardcore systems software seem to be mostly self-taught. In the past I think it was even more prevalent due to the dearth of easily accessible information and learning material. Blogs and tutorials weren't a thing, you had to learn a lot of things experimentally, which often had the side effect of much deeper understanding than if you'd just read about it. The massive increase in available software engineering knowledge is a huge improvement but it also reduces the need for understanding the material. If you can google a solution to a software problem you don't necessarily need to understand why or how it solves a problem.
What matters far more now is nationality, skin color, and how many studies you've published, quality or not. You can guess which political party decided to make it that way.
Computer science is a theoretical discipline, ideally suited to being taught in an academic setting, and concerns itself with the study of computation, including rigorous mathematical proofs, scientific experimentation, and complete thought experiments like super-Turing computation. It can sometimes offer insight into thorny programming problems, but for 90% of programs isn't directly applicable. Where research results from CS do become applicable to programming, they're typically implemented into libraries that can be used by software engineers with no deep understanding of the underlying research. Many subfields of computer science don't require the ability to write executable programs at all, though applied computer scientists are valuable to bridge research with the world of software engineering.
Software engineering is a practical discipline, ideally learnt ‘on the job’ in a self-taught or apprenticeship structure. It concerns itself with how to make good programs in a corporate environment, and in addition to practical programming concerns itself with other hard problems in building reliable software: testing, requirements analysis, and team collaboration dynamics. It's very rare for software engineering tasks to require knowledge of computer science concepts, though occasionally software engineers might choose to implement some CS research.
When people mix these things up, chaos ensues. For example, interviewing software engineers based on theoretical knowledge of algorithms, or demanding a CS degree as a prerequisite for your software engineering job, is largely useless and a terrible predictor of the quality of software they'll output — in the unlikely event that they need a sophisticated algorithm, a good software engineer can pull in a library or, in the worst case, search for it and translate some pseudocode. Likewise both disciplines end up getting maligned for not being the other — I've heard people both bemoaning that self-taught programmers can't balance a red–black tree (why would they?) as well as grumbling that computer science graduates learn weird languages like Haskell and Prolog but can't write a Python unit test.
Furthermore, because companies expect software engineering skill from computer science graduates, computer science degrees start incorporating more software engineering material at the cost of theoretical computer science material — and they inevitably don't teach it very well, because practical skills are better learnt on the job. Theoretical skills for research positions are increasingly shunted into masters or PhD levels. If we're not careful we'll end up with the worst of both worlds, where CS degrees are useless both for theoretical CS research and for software engineering, but for historical reasons are considered entry-level requirements for both professions.
tl;dr Companies should stop expecting CS degrees and knowledge for software engineering jobs, where it's all but irrelevant, and provide better on-the-job training and apprenticeship schemes. Universities should stop turning their CS degrees into software engineering degrees, and stick to teaching theoretical CS — because universities exist precisely to teach those things that are hard to learn on the job, or not immediately relevant to professional practice. If they do offer software engineering education it should be in a dedicated SE degree that can focus on SE theory, around e.g. organizational structure and quality assurance. We don't expect physicists to be able to design and build a house, or architects to answer interview questions about quantum field theory; why are we so determined to erase the distinction between theory and practice in the case of software?
It's easy to teach specifics. It's hard to teach people how to find them. It's tricky to teach people when to start looking; and most of the biggest mysteries in software development can be boiled down to the elusive "why".
In my experience, the biggest confounding factor is "secrecy", a word usually written out as: "proprietary".
There are two approaches to learning: definition and inference.
A public system, like Linux, can be learned from its roots, each definition built on the last.
A secretive system, like Windows, must be inferred. Every piece of knowledge is limited to the "best guess", and whatever conditions that guess was tested with.
Unfortunately, most people are working with secretive systems, so inference is the only strategy they get any practice with. That's the wrong strategy to apply when writing software, because - at the very least - one can read their own codebase.
Inference can teach you something close to specifics. It can tell you "when" to start looking (now and always, because your testing conditions are fragile). Unfortunately, inference can never truly answer you, "why".
Remove the secrecy, and everyone will know to value the basics, because those basics lay at the foundations of every answer.