* Database Systems (relational algebra, SQL)
* Concurrent Programming
* Network Programming
It seems most are exposed to them partially through project work but without the base knowledge.
Is this typical for CS undergraduate degrees because you get to pick your own classes?
People/companies commonly treating both the same is IMHO one of the major problems of the current industry.
None of the topics you mentioned are fundamental to CS.
They are fundamentals of software development.
With respect to computer science they are at most specializations, and even then, what you might do with them in a science context can differ greatly from what you would need to use them for in production-focused software development. Though they do contain some fundamentals, e.g. set theory in relational databases and graph theory in network programming and concurrent programming.
You can (rightfully) have a master of computer science _without having ever written a single line of code_. And going back ~20 years that wasn't even that uncommon.
Today a lot of universities have realized that this mismatch causes problems and are teaching the fundamentals of software development in addition to the fundamentals of computer science. Additionally, a lot of computer science today requires the use of tooling that demands some programming and SQL.
Still, what the "fundamentals of software development" are is a much less clear topic than the "fundamentals of computer science" (and even there people disagree all the time). And "relational databases/SQL", for example, is one of the things people can strongly disagree on: whether it's foundational to software development or not (anymore).
Learning how to use SQL is more of a trade school course.
I have a BS, Bachelor of Science. The foundational classes were math, math, math, math, and more math. There were no classes in how to operate a machine tool. I befriended the guy who ran the machine shop that built apparatus for the scientists, and he taught me how to run the machines. But that wasn't a class, it was just something I did on my own initiative.
The more important question is: Are you explicitly mentioning databases, concurrency and networks in your posting? If not, then it explains why candidates are not filtering themselves out.
Why don't they know database systems? They might have taken a database course for 4 months 3 years ago and never touched a database again because it's not a trade school. School just validates that you can learn a series of related skills over a few months when necessary.
What's the last thing you started and dropped after a few months just before the pandemic? How comfortable would you be if you interviewed for a job exclusively on that skill?
21 year olds won't know very much of anything in general.
A 21 year old 3rd year college intern is... 21 years old
Three quarters of a four year computer science degree doesn't change the fact they're a 21 year old.
Even in the topics they have covered, the knowledge won't be very deep.
A true mental model of concurrent programming is not something easily obtained.
Frankly, most 35 y/o engineers don't truly appreciate the intricacies of inter-thread concurrent algorithms, unless it's their specialized area.
Frankly, most engineers in the industry are too lazy to learn SQL well.
Lower your expectations of 21 year olds. Lower your expectations of the workforce in general. Hackernews is a self-selecting community of tech workers who study their job in their spare time as a hobby.
Most people I've worked with go home and watch football after 5pm.
I graduated with my Bachelors in CS in 2016, and those classes were optional senior electives. You were required to take a certain number as well as some required ones (i.e., Computer Architecture, Analysis of Algorithms). I chose to take Database Systems, Data Mining & Machine Learning, Robotics, Computer Vision, etc. as electives but not Concurrent Programming or Network Programming because I already felt comfortable with those topics. Others chose classes in topics like mobile application programming or programming language theory.
Those topics may be foundational for you, but not for others.
Interns / juniors have little to no practical experience, and practical experience is where you _really_ learn how to program.
I think there is room for innovation in CS/SE education. Imo some sort of "code review" class where students analyze and report on a bit of code would do wonders for interns / juniors ability to onboard quickly into their first job. I've written about this in the past [1]
[1] https://sophiabits.com/blog/the-one-change-id-make-to-se
For example several first year students not only had no familiarity with calculus, many were having a hard time with algebraic concepts. Concern was around finding some way to add a remedial math course before Calc 1. This pushed everything else out farther.
CS grads may lack foundational knowledge simply because high school grads lack foundational knowledge. What it took to "pass" high school 30-50 years ago seems substantially more rigorous than today.
The administration didn't care. They didn't even care if the students dropped out. Butts in seats = more money. That's it.
I would suggest looking at the CS departments page from whatever university you mainly recruit from to get an idea what the core of their program looks like.
Perhaps the problem is that we train so many computer scientists to do programming, but that's the learn-on-the-job part I guess.
I met version control and unit tests on the first day of my first real programming job, because I studied physics, math and computer science. And I'm still (by far) the go to person in the office for all matters related to plasmas or cosmology.
All the things you mention are fairly big topics, and not only that, you only really understand them by doing a bunch of coding over several years. You can get introduced to them in university, but a degree course is only so many topics and they all need an introduction. Chances are if a student has done these things it's superficially, in practicals that are similar to practicals in other sciences: you don't really understand it, you write it up, and then you don't rely on what you did for further studies.
I studied a bunch of things at university, leaving without being particularly good at any of them. For instance I built a radio and a bridge in my first year on the engineering degree, but I couldn't just become an EE or civil engineer from that. I wrote a thesis about early WiFi for the business school, but that doesn't mean I could just be a product manager.
Similarly, a student may have done a bit of joining tables in SQL, a bit of multithreading, and a bit of routing during practicals, but you wouldn't think they really understood any of those things in the way someone with a couple of years on the job would.
However in my experience most of them pick things up relatively quickly and end up becoming good engineers and reliable team players.
At a BSc level with no experience we should be looking for genuine curiosity, motivation and interest in learning and solving hard problems.
Everything else can be taught.
We have even expanded our program to include interns with degrees in other areas of science such as mathematics, physics, materials and mechanical engineering, some with very little programming experience, with great results.
Our objective is always to hire them and retain them, so we do invest plenty of resources in training them well.
My curriculum was: Year 1: Intro Comp Sci / Year 2: 2 courses in Logic, 2 courses in data structures and algorithms / Year 3: 2 courses in processor design, 1 course in finite state automata, 1 course in parsers / Year 4: a course in ethics, a course in team programming (which covered UML and version control), and two electives.
I believe a major was 14 courses, so I'm missing one; or it may be that it was three electives. I didn't take databases because I was already a paid sysadmin before I started college, and mostly, at the time, database courses were just ten tedious weeks of normalization crap.
Also, treat your interns better. The reason to hire interns is because you plan to devote some of your resources to help them in their professional development. Stop asking what you can get out of your interns and start asking how you can best give something to them.
This may be different in different universities and it may have changed today, but the tests we took were largely about remembering lecture talking points and being able to regurgitate them with or without any real understanding.
For example, you might learn a bit about relational databases, but your understanding will be limited to the talking points of the lecture. E.g., you might get a question asking you to explain the use of primary keys, but if you asked them how they might design a relational database for some data with normalized tables, they'd have no idea, because they'd never actually put the talking points into use.
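To make that concrete: a minimal sketch of the kind of exercise I mean, using Python's built-in sqlite3 (the schema and data are invented for illustration). The point is seeing primary and foreign keys do actual work in a normalized design, not just defining them:

```python
import sqlite3

conn = sqlite3.connect(":memory:")        # throwaway in-memory database
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this per connection

# Normalized: author details live in one place; books reference them by key
# instead of repeating the author's name on every book row.
conn.execute("""CREATE TABLE authors (
    author_id INTEGER PRIMARY KEY,
    name      TEXT NOT NULL
)""")
conn.execute("""CREATE TABLE books (
    book_id   INTEGER PRIMARY KEY,
    title     TEXT NOT NULL,
    author_id INTEGER NOT NULL REFERENCES authors(author_id)
)""")

conn.execute("INSERT INTO authors VALUES (1, 'E. F. Codd')")
conn.execute("INSERT INTO books VALUES (1, 'A Relational Model of Data', 1)")

# A join reassembles the information without any duplication in storage.
for title, name in conn.execute("""
    SELECT b.title, a.name
    FROM books b JOIN authors a ON b.author_id = a.author_id
"""):
    print(title, "by", name)
```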
It upset me because by the time I had finished university I had launched two startups and worked professionally as a developer for 3 years. I was consistently helping students with practical exercises at uni, given I was one of the most capable on the course, but none of my experience really helped me in the tests, because I discovered so little of it was about practical understanding; it was mostly an extended English exam, testing writing ability and the ability to regurgitate talking points from lecture slides.
It's not that the students weren't smart or capable individuals, it's just that the course didn't incentivise obtaining depth of knowledge in what was taught, so no one did.
SQL? Yes. Database theory? We never discussed it beyond "what is an index?", so I never looked at that topic again.
Concurrent programming? Never dealt with it outside of courses and jobs that care about it mention it, so I self selected out. So I never looked at that topic again after the course.
Network programming? Took a course in it, but outside of a few devops use cases, I have never had any reason to recall that knowledge. I just memorized 5 versions of that test and went into it with that.
My advice to my undergrad self would be to basically abandon anything that is not fun projects (so you get familiar with the languages themselves), hackathons (so you have culture fit), and leetcode.
I can't imagine the average ROI on learning these things is great.
Second, I don't know what part of the world you are in, but usually formal education courses have certain requirements attached: roughly x1 hours of social sciences, x2 hours of humanities, and so on. Then there are basic prerequisites like math. In the end, the number of hours for the subject itself is not as high as it would seem at first glance.
Finally, there is competition among education providers and their "tiers". Universities/colleges compete not only among themselves, but against codecamps too. The premise of code camp is to help somewhat computer literate people memorize a bunch of text macros that yield certain result on the screen. Colleges must adapt to compete, dropping the quality floor even lower.
In the end, unless you have graduates from "general" college/university you can expect that deep foundational understanding will be replaced with quick factoids on how to produce certain result in certain specific context without understanding said context or even being aware of said context.
Greybeards looked the same at "us", by the way.
Also, what does network programming mean to you?
Basic networking knowledge, or actually writing low-level network code?
Besides that: higher-ed institutions suck. Unless it's something like top 3, do not expect a lot just because it is a degree; everything is up to the person.
The rest were mostly fluff. Interesting fluff, and good background material, but very little useful stuff.
The interesting things were available through student societies, and if you managed to get to work at the University's IT department.
Your typical liberal arts degree has 4 years; about 2 of them are dedicated to liberal arts, or core education, and two years to the major, so you only have, say 16 classes for a CS major. Of those, say 4 are math, 4 are the intro programming sequence, and 3 are the architecture sequence (circuits, architecture, and operating systems), so you only have 5 or so classes, for your foundational classes and cool electives.
What normally happens is that, depending on uni and student, students will take some of them, but not all of what you call foundational knowledge. And the same with what the next person wants ;). Your foundational knowledge doesn't include AI, Data Science, cloud, mobile, graphics, UI, ...
Learning more theory later on is still possible but those are more like financial investments that give lower yields over longer time periods. So they are best done "early" in your personal development.
In terms of interviewing interns, just find out what they do know, and judge the best one on a balance of talent, knowledge and people skills. They will do you proud. No need to have a set expectation against specific skills unless that is the core domain they'll be working on.
I never took any database modules, I am entirely self-taught on SQL and MongoDB through some side projects.
I never took any concurrent programming courses (not interested).
I only took one network course and quickly forgot everything except that there are 7 layers in the OSI model (or is it 5?)
Needless to say, I have never needed any of that knowledge in my work as a frontend engineer. Even if it is somehow needed, I can just fire up MDN/Wikipedia or ChatGPT and ask.
What I did take are a lot of software engineering and AI modules, and I found them to be more useful or interesting.
This might be especially true for theory/fundamentals. It's easy to skip that stuff if your program's focus is on immediate job-readiness training.
I actually would love to know more about network programming. Specifically, "how do I go from electrons wiggling in a wire (or radio waves in air) to the TCP stack"
I have had hints of this - the OSI model, wireshark, etc but when it comes to figuring out the nitty gritty of networks I feel like I'm stabbing around in the dark. How should I configure my networks on AWS? What's the best way to get VPC's talking to each other safely?
At an interview with a former (excellent) boss I was doing an exercise and set up a REST API. He said one simple thing - "ok, but why HTTP?" - and suddenly my very, very faint memories of netcat came to mind, and I ended up making it ~10 times faster by just putting bare messages on TCP instead of using HTTP. (I'd have used a low-overhead protocol in a production situation.) But I want that sort of idea to come naturally.
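The gist of it looked something like the sketch below (a toy reconstruction with Python's standard sockets, not the actual interview code): newline-delimited messages straight on a TCP socket, with none of HTTP's request lines, headers, or parsing.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 9000  # arbitrary local port for the demo

# Bind and listen before starting the thread so the client can't race it.
srv = socket.create_server((HOST, PORT))

def serve():
    # Plain TCP: echo newline-delimited messages. No request line, no
    # headers, no chunked encoding, nothing to parse beyond the newline.
    conn, _ = srv.accept()
    with conn, conn.makefile("rwb") as f:
        for line in f:
            f.write(b"echo: " + line)
            f.flush()

threading.Thread(target=serve, daemon=True).start()

# Client side: one connect, then bare messages on the socket.
with socket.create_connection((HOST, PORT)) as c, c.makefile("rwb") as f:
    f.write(b"hello\n")
    f.flush()
    print(f.readline())  # b'echo: hello\n'
```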
Should I just take the AWS cert courses or is there a better way? I want to have more than just "uhh netstat -peanut and grep for stuff" to figure things out.
Similarly, how do the internals of Linux work? I can generally get what I need to do done, but I only knew to check load average after a colleague mentioned it.
Everything I've learned is top-down; I think I'd like to learn bottom-up.
To put it this way: I'm just about a decade out from when I finished my BS degree, and my CS courses were (in order):
1. A SICP-based intro course using Scheme. I loved this course!
2. A data structures course using Java
3. A machine structures / light hardware design (just to understand pipelines, caching, etc) in C and assembly
4. Now pick whatever CS area you want to study
I mentored a new grad from my uni who just graduated, so is roughly 10 years younger than me. The curriculum had changed to:
1. Same SICP-ish course, but using Python
2. Data structures was cut shorter to make room for ML
3. C++ course to build some kind of "distributed system" (but no discussion of the fundamentals of how the ABI works, for example)
I had to explain very basic things to this guy (e.g. what the stack was, basic gdb usage), who was otherwise very bright.
That's not to say "the kids these days are all dumb!", as I have TAed some classes where some of the students were far better hackers / coders than I was, but I just think the funnel towards skills and technology that have Proper Nouns that can be put on the resume is an unfortunate pressure being placed on universities these days.
The "foundation" depends on the context. Your process may be different but many companies will consider D&A as foundational knowledge as it is the closest thing we have so far to a standardized measure.
In my country you get to pick just a small percent of classes and the foundational ones are mandatory for anyone.
>Database Systems (relational algebra, SQL)
We did a shitload of courses on databases in both undergraduate and graduate programs. Not my favorite, but they were useful. There's no way to avoid dealing with databases as a programmer.
> Concurrent Programming
Did that and parallel programming, too.
> Network programming
Did that as a subset of the Operating Systems course, where we had to tackle many aspects of Linux systems programming.
We did a lot of other courses that were at least just as important: Algorithms, Data structures, 3D programming, Testing (forgot what the course was called), Formal Languages and automata, Data mining, ML, AI, Digital circuits, Cryptography, Big data, Cloud, Complexity, Web, Semantic Web, etc.
I don't know about CS graduates, but I've seen people with 5 to 10 years of experience lacking basic skills such as commonly used sort algorithms and time complexity. Their justification was along the lines of: "I know JS and TypeScript and React and I don't need anything else".
I still remember being asked ridiculous questions about design patterns and UML as a new grad. Stuff that is never taught in a typical CS degree - but interviewers seemed aghast I didn't know them. I still remember one saying "but you didn't learn design patterns?!".
I think the only things you should expect from new grads is ability to code, basic understanding of computer architecture, and possibly data structures / algorithms.
We learnt the fundamental concepts plus did some programming, e.g. for databases: the relational model, normalisation, etc., coding in Java/PHP with Postgres/Oracle.
Our curriculum is pretty generic, and most of the alumni work in the industries as software engineer, IT consultant, startup founder, etc. Not many do academic work.
I'm not saying that all of us understand the basic concepts well, though. Some maybe only read a bit of theory and spend more focus on writing apps, instead.
I've had UCB/Cal State students say things like "Why would I ever need to interact with Excel when there's X?" - uh, if your boss or execs use it! I recall the "Hadoop big data expert" who couldn't figure out how to do a VLOOKUP during an open-browser interview, or folks ready to graduate from a data bootcamp who couldn't give even one example of how they could distinguish between plausible and accurate data. Also, if any local programs teach students about the ascendancy/utility of PowerShell for anything IT/cloud/admin/DevOps-related, it's not trickled down to recent applicants.
I've been interviewing freshers for a long time.
Sometimes they'll know a lot of things, sometimes they can't tell a computer apart from a file cabinet.
One can't predict what a person knows based on their degree; yes, it does give you a general idea, but that's not the case every time.
Moreover, there are various other factors you should consider while interviewing, especially with freshers: they might be nervous, they might have travelled a long way to get there; many, many factors.
Looking back on it now, a lot of the courses could have been replaced with something a bit more practical, e.g., how to use Git. But strangely, some of the larger team projects made all sorts of assumptions around being able to code, knowing about networks and networking etc. which were never explicitly taught in the course.
No concept of processes, no idea about any data structures, intimidated by everything. They've studied C/C#/Python in a class but can't remember anything about it. It's really a lack of passion and interest and it's endemic I think. People study CS because they've heard it pays well.
I fully expect these people to become my managers!
For example, my concurrency course involved a lot of formal methods and temporal logic. Because the teachers were doing research in that area. In retrospect, this was all a bit academic and not very practical. This stuff does not map very well to the real world when you are actually trying to solve some concurrency issues. But enough of it stuck that I was able to read up on this when I needed to.
And of course, some of that academic stuff actually works out in the real world once in a while. E.g. the java.concurrent package is a straight integration of a framework that came out of the academic world and very nicely done. Great stuff and I still remember pulling that in as a separate maven dependency before that was integrated.
I think of university more as a place where you learn to learn new things rather than as a place where you actually learn a lot of things. Mostly becoming a good software engineer is basically an apprenticeship. I was lucky with my early gigs and I learned a lot on the job from more senior people. I've worked with lots of people with either no degree or a non computer science degree as well. It's fine as long as they can learn new things.
Do an experiment: pick a handful of intern applicant resumes, look at their degree, and then look at what the uni requires to get that degree. In the US, you will often find that there are ABET-accredited and non-accredited versions of the same degree. What you are looking at is commonly required for ABET but often avoided when possible. Or, in the case of one person, I've found that the college they claim to have studied at doesn't have any such degree program.
Reminds me of a conversation with a new hire that was re-orged onto my team, and a machine learning engineer on another team. She said "Ah, you are lucky to join his team this way; they are all the people who took those hard classes like Operating Systems and said 'more please' instead of quickly dropping the course. Their interview process is very hard."
It's also worth remembering that the US BS program is not like Germany's old Diplom system. It's not 2 years of gen ed then 3 years of only computer science. It's 4 years total and entirely possible for a student to study SQL for one semester two years ago and never touch it again as part of coursework, if they took databases early and don't have a school job.
My impression is that the philosophy of CS curriculum at most places is to keep the barrier of entry low. That's why bootcamps are a thing, and my fellow CS grads are excited about ChatGPT being able to write code. On the other hand, there is a growing shortage of electrical engineers because the level of gatekeeping is too high.
I am an electrical engineer myself and I have working knowledge of all the things you mentioned.
I think I had one course on "parallel"(concurrent) programming. I had exactly zero courses on SQL or relational algebra, although we did write a basic database. I had one course that MIGHT have touched on a bit of network programming.
So, academically, I would have had little or none of what you are considering is "foundational" CS knowledge.
I actually did learn SQL during a job I did during school, and I was fascinated with race conditions and mutexes so knew a fair bit about the theory of concurrency. And I loved learning about all kinds of protocols, like SMTP and Telnet and FTP, so I knew quite a bit about networking.
I think the disconnect here isn't that "students don't learn the fundamentals", it's that you think the fundamentals are what they need to "do work" on day 1. Back in the day, University was supposed to train you to THINK in your field, and so things like data structures and algorithms, yes.
And, I have to say, it has done well for me. I started as a C/C++ programmer, then moved to Java, worked in PHP and have now been writing a lot of Python. I've helped with C#, Visual Basic and a number of other languages.
And if tomorrow I need to use Go, or Rust, or $whateverLanguageOfTheWeek it is, I'll be able to. I understand what can make a program run fast or like a dog.
If someone can do all of the work through 3/4 years of university, they should be able to learn and do a large variety of tasks.
Nowadays, it might be that a CS program requires some or even all of the courses you mention in their study. But do you expect someone who took this for 1 of perhaps 20 courses to be an expert at them?
There are lots of "foundational knowledge" materials out there you can pick from the realm of "CS" - there's a theoretical track, practical track, grad school track, "just enough CS to be CS while I have fun doing other stuff" track ...
If it's vital [to you] that an intern (which, most often, is defined as someone you'll only have for a few weeks or months and who is distinctly still in school) have all that "foundational knowledge", then put it in the job posting :)
Most internships I've ever seen have far fewer requirements (because the company is going to do some amount of training for the interns during their tenure). They're more along the lines of:
- Jr majoring in ...

And ... that's it.

I knew "a lot" coming out of high school, more than ended up being covered in pretty much every [core] class I took over the next couple years. That didn't make me at 20 comparable to someone who'd been doing the work professionally for a decade or more :)
Be aware that university today is largely unaffordable to most. I don't mean that as in "it's expensive". I mean that literally most people don't qualify for federal student loans, and cannot afford the payments on private loans. Yes, you have to pay on private loans as you go to school. Those payments are based on a percentage, which has obviously increased massively since 10 years ago. Schools don't have jobs for students who are not in "financial need" (parents' income). Therefore you have to work a job unrelated to your field just to go to school.
Good engineers find out quickly that university is much more about the fed paying itself at your expense. The system is corrupted by politics (the blue kind) with accreditation clearly influenced by whomever sits in the chair. It's not about merit, it's a gravy train for gov workers.
Look elsewhere for employment.
I chose the "traditional" stuff, like operating systems, compilers, did my databases course with a lot of interest, embedded, and then wrote some software in my free time (beammp.com's server, which is a game server for a multiplayer game, lots of concurrency and network stuff), and other side projects, a lot of stuff from scratch, and a good lot of open source contrib.
A lot of my peers don't do their own side projects, and if they do, it's a website or something on top of layers of abstraction (like a CRUD app in TS).
A different subset of my peers chose specializations in which they barely need to know what a variable is, such as security related courses which are more law than CS (e.g. forensics), some who go for "fullstack" webdev almost exclusively (no interaction with hardware, DBs only through abstractions with no sql in sight, abstractions of abstractions and a lot of copy paste).
The few of my peers who are interested in CS to the degree that they enjoy learning the foundational things share my view, and I often talk to them about this topic.
I don't mean to be dismissive of those other disciplines; they are valid, and the depth of knowledge that can be acquired about, say, TS, is not something I'm questioning.
To me, it's just not that CS-y to write HTML; I feel that close to the hardware is where the programmers with the degrees should be. In my experience, as a webdev you get outpriced by third-world-country developers very easily, and that's a tough spot. Not so much in, say, robotics.
Engineering is a craft you learn under the tutelage of peers and masters, i.e. at work or perhaps in a community (e.g. open source). That is what you should expect to provide to your interns. They're not cheaper, fully-formed labor.
Since then, I perceive that CS programs have come under immense industry pressure to crank out software engineers and not computer scientists. The need for real computer science is not very large as a percentage in the entire computing/software industry. And generally once a funky CS problem is solved, it gets encoded into software and then reused by software engineers over and over.
I think this has forced CS departments to start directing some of the curriculum towards software engineering rather than CS -- resulting in students who are more diluted in the fundamentals, but who also aren't great software engineers.
I also think that some departments are also now trying to figure out how to support the massive growth in the kinds of computer science theory needed to support the emerging data science/machine learning/deep learning/etc. fields. When I graduated GPUs were still relatively new and we simply didn't learn those until later MS programs. Now I couldn't see somebody graduating undergrad without knowing some basics on how GPUs work on the hardware and software side, and understand the data structures and computations required to work with the new classes of models emerging in the field.
If you need someone to write a network stack and a concurrent database system, you don't need an intern, you need an engineer.
Interns are generally students. Sometimes they're still in school. If they had skills in these kinds of deep topics, they would be applying for full time engineering roles, not internships.
People apply to internships because they lack experience and skills. It's your job to mentor them and expose them to these kinds of specialized roles so that they can decide in what direction to start their career. Interns come to you to learn and gain experience they couldn't otherwise.
Interns aren't workers, they're apprentices. Once you treat them that way, you'll have a much more rewarding relationship. They're there for you to teach, not to do an engineer's job for a fraction of the pay.
Nearly all of them are strong in the areas of design and software engineering principles. They are also strong in SQL, the Git/GitHub ecosystem, and at least one language/framework. And they know just enough about infrastructure to use a simple CI/CD pipeline. But anything beyond that infrastructure-wise (networks, security, tiered architecture, IaC, services, PaaS, etc.) is foreign territory for them because it isn't covered during a typical undergrad program.
Now this is perfectly OK in my opinion because I provide that for them initially and teach them many of those concepts while coaching their projects. There is limited time in any undergrad program, and I'd rather universities spend that time focusing on what they are currently focusing on because that is a foundation I can work with.
In general, however, fresh CS grads from even good universities need to be onboarded on software tools and certain types of systems when they have their first software engineering jobs. For example, many college students don’t know git. Basic sysadmin and more advanced Unix command line skills are also generally not taught.
Personally, I still think that you should be exposed to those for a computer science degree. Here is why:
1. Universities have evolved, and maybe always have, to meet the demands of the labor market, and most jobs will touch on those topics in one way or the other.
2. Generally, you need to persist data, whether it's on a drive, in a data store and so on. Having heard of different data stores and maybe differences in query languages, seems very relevant. This doesn't mean you know how to write your own database.
3. Which processor these days does not have more than one core? Even in languages like Python or Ruby, data races can result in subtle bugs. Having some idea that access to shared resources may need to be protected is useful (see the sketch after this list).
4. Whether it's writing code in microservice architecture or integrating an API, we make network requests. Having an idea how this might look different, for HTTP, TCP or UDP provides a lot of context to make better decisions.
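To make point 3 concrete, here is a minimal sketch (plain Python threading; the counter and loop counts are invented for illustration) of the kind of subtle bug I mean, and the lock that prevents it:

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    # Read-modify-write split across two statements: another thread can
    # update counter between the read and the write, losing increments.
    global counter
    for _ in range(n):
        tmp = counter
        counter = tmp + 1

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:  # the read-modify-write is now a single critical section
            counter += 1

for target, label in [(unsafe_increment, "unsafe"), (safe_increment, "safe")]:
    counter = 0
    threads = [threading.Thread(target=target, args=(100_000,)) for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    # "unsafe" typically prints less than 400000 (lost updates);
    # "safe" always prints 400000.
    print(label, counter)
```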
To me the idea of a formal degree, is some level of exposure, so that even when you have never touched those things years later, you have some reference in your head to start looking it up and refreshing your memory.
I also agree, that a computer science degree doesn't mean you become a software engineer and so it might not make sense to force everyone to take those classes, but then again, see point 1. Alternatively, which classes would be more suited or a viable alternative to those if you have to make a choice?
The various software engineering modules I had were pretty lacking and flawed in their teaching, and I could tell that at the time. Even now, with several years of industry experience, I still believe they were lacking in foundational areas, especially the ones you identified.
For example:
- Databases were barely taught or even used. We had some fairly poorly put together “web” module that covered PHP, a tiny bit of SQL, no HTML, CSS, JS.
- Networking module was “here is a Cisco CLI, setup RIP following these instructions, congratulations you are now network experts”. I think we covered NAT in one module.
It seems to be a deeper issue too. I remember that the vast majority of people in the software engineering modules not only couldn't write code, they'd never even so much as attempted it.
This was in other modules too - a majority didn’t know what operating systems were beyond “what, there’s things other than windows?” all the way to being for all intents and purposes tech illiterate.
To the point I remember a second year teaching another second year what copy and pasting is.
This was around 2012-2015.
So I don’t know where the blame lies, really; I think it’s all an amalgamation of:
- Clear lack of interest or passion about anything in their degree of study (why the hell sign up for it then, with so many people doing that?)
- Completely failed tech or STEM education
- Total failure to vet applications to the university
- Whatever STEM GCSE or A levels they had gained clearly not being up to scratch
- Some modules being so dumbed down as to be meaningless or something you couldn’t gain equivalent learning from googling for a few days
I have noticed that undergrad CS curriculums in most places are shying away from making practical skills part of mandatory courses.
Sure, theoretical underpinnings are ‘more’ foundational. But unless you know how to use these (and similar) tools, there is simply not much motivation to understand the theory naturally.
Let's compare with a graphics library: you can learn the DrawRectangle API without learning the DrawLine API.
Not so with concurrency. Some kids won't understand the debugger (if any, LOL). Some kids will get stuck at coding. Some kids will have weird conceptual hangups. All the kids need to learn all of it, all at once. It's a big chunk.
Same problem with networking. Pre-reqs are seen by uni admin as a way to restrict tuition income, LOL. You can't assume the kids are up to the level of a Net+ cert or even an A+ cert. It's a big chunk all at once. See above: you "probably" can't do networking "correctly" unless you async it. Does the class system even have a pre-req that includes the concept of a TCP/UDP network port? It's just a HUGE topic to inject all at once.
When people say DB they are usually unclear. I took a senior level DB class and learned Codd-Normal forms and all that which makes SQL query writing pretty easy. The problem is your interns aren't graduated seniors yet and they probably eliminated that class from the curriculum because "No-SQL movement" or whatever new hotness caused a distraction. I think you'd be a better logical thinker about organization of data if you know your Codd-Normal forms or have at least been exposed to the general concept. But other kids want to learn mobile app dev or frontend or whatever new hotness.
Finally a lot of uni work is filtering. Must be smart enough to learn up to this level even if you never use diffeqs again. So the kids you're interviewing don't know anything useful, but they're smart enough to learn, and it'll turn out OK, probably.
There was some depth but it wasn’t geared towards praxis. It was much more geared towards “base knowledge”, and the curriculum looks much the same today. SICP is still popular.
I happened to take a networking class where we learned sockets, but most people didn’t.
The core upper division CS classes were: OS, algorithms, compilers. Then people may take DB, graphics, networking, more algorithms, UX, “software engineering”, or maybe some HW classes.
Another thing to consider is where you’re recruiting from. Berkeley/Stanford/MIT will be somewhat similar but San Jose St or other state school will be much more focused on teaching python, SQL, C++, etc.
This all is to say: as a matter of curriculum design maybe we can train future software engineers better, but when presented with an individual candidate consider not only what the candidate knows now but also that person's ability to learn.
Even if it doesn't sink in while reading - and it probably won't - it will make the lecture make so much more sense. You'll learn much faster than you would without the preparation.
In my view, those classes and the education in general have exactly the same role with respect to your first job. They won't sink in fully during those four months, but they will provide context for the experiences in your first job and allow you to gain experience much faster.
In other words, view that new graduate as somebody who hasn't covered the material in class yet, but has done some reading to prepare.
These are not required courses in the CS curriculum of most schools in the US. Elective yes, but at some smaller schools, these are not even options.
Have you checked out the degree map of some of the schools? (Even the most well known ones) And you will see it.
1) Different colleges have wildly different expectations and learning material. I went to a very small school with a high-quality CS education. I was amazed by my friends who went to bigger schools; I'd often have to study 3x more than them for the same grade, and their course material was basically constantly one semester lagging.
2) You're interviewing juniors. Major concepts in CS really started to "click" for me and my cohort towards the end of junior year. I think this is about when you have enough exposure to really start to grok big fundamental topics.
3) A lot of students do not learn the fundamentals and just hobble along to get the degree. I couldn't believe some of the shit my friends didn't know after getting the same degree as I did. Stuff like: they still didn't understand pointers, couldn't explain TCP vs UDP, etc.
But they're not upset when some interviewer tells them to do Leetcode whiteboard performance art, for pieces they've hopefully memorized but will pretend to be approaching for the first time, as a hazing ritual, and the interviewer, briefly feeling in a position of power normally denied them, says, "I want to see how you think", as if the interviewer can actually discern that.
People are accustomed to having their ducks lined up for them, and they just have to do the things they were told, and then they get the big paycheck, and they get to be the one hazing the new pledges. You're not playing along with the system.
And your list isn't at all what I expected it to be, all pretty practical 'application' stuff. I think that's largely the 'problem'.
> Is this typical for CS undergraduate degrees because you get to pick your own classes?
I don't think it makes much difference who chose them (student or programme director) - they'll be different at different institutions (or among students at the same institution) and not all of them will line up with your own education or opinion of what it should be.
If you absolutely need those things, then make it clearer in the job description. If you don't, then why not ask what they have studied (or most enjoyed, or whatever) and ask questions about those areas?
I would argue those are specialized areas, not base knowledge. Moreover, what kind of questions are you asking? It is more likely that you have a misaligned assessment of an undergraduate's knowledge.
That's with an undergraduate degree. Some programs don't cover these things, or they are optional.
Personally, I definitely lacked knowledge of a few things when I graduated from undergrad (2008)
* Source Control. This wasn't as commonplace back in 2008.
* Linear Algebra. This wasn't a required class, but I took it after I graduated and proved to be invaluable.
* Concurrency. I ended up learning this myself, since this was only very lightly touched on.
No one is going to be familiar with everything when they're fresh out of undergrad. But usually you have at least some specialty (mine was 3D graphics at the time).
Some of the best I've had were from math, physics, philosophy, EE, & drama.
Not only do students come into university (and sometimes even into CS) not knowing what a file system is, many of them have a total lack of interest in learning what is perceived by them to be pointless.
I'd argue it is going to be pretty difficult to engage with any of those foundational topics if you aren't willing to engage with the basic metaphor of most operating systems, files and directories.
None of the courses you listed I would expect of all CS students in an undergraduate degree, and quite frankly, databases is something I would explicitly expect few CS students to have taken (the only branch I'd expect to be less popular to take would be specialization into numerical modelling, although that's more because I expect the people taking such courses to be science majors and not CS majors).
Honest question here: what's the location and comp like?
Keep in mind some students will end up with 3 internships during their undergrads, and many will end up interning twice at the same place. Why should they jump ship to your company?
I recall a story someone told me a while ago. A software business that paid local CoL/prevailing wages hired an intern one summer who just ran circles around the other, more senior devs. Needless to say they loved him, and the next summer they tried to get him back, even offering a signing bonus for an internship (something they considered unheard of), but he was already at a large search engine company down in the Bay. You can guess the comp was probably already 3x what his previous job was offering. Of course, he wouldn't return.
There's a whole class of engineers who are completely invisible to most companies, even if they are in the same "local market" [0][1] (some use the term "dark matter devs", but I know it has another meaning [2]). These guys tend to fly under the radar quite a bit. If you are in a tier 2 market or company, your chances of attracting one are close to nil. Because they are extremely valuable, they don't interview a lot and tend to hop between companies where they know people (or get fast-tracked internally).
FAANG companies have internship pipelines, with bonus for returning interns. These guys are off the market years before they even graduate.
[0] https://blog.pragmaticengineer.com/software-engineering-sala...
[1] http://danluu.com/bimodal-compensation/
[2] https://www.hanselman.com/blog/dark-matter-developers-the-un...
Computer Science is about, you know, science. Not craft or engineering. A typical programmer's work is craft.
You would not hire a metallurgist, trained to develop new sorts of steel, to work in a machine shop.
Yes, big corporations can have research departments where computer scientists are needed; every big player has one. Most programming work is not that.
It is a problem in our craft: the requirement of a university degree for the simplest positions. It is wrong. You try to hire people who are overqualified in some areas (which are not needed for these positions anyway!) and underqualified in the needed skills. Because they were trained for OTHER work!
What you really get out of CS education is not a lot of crystalized knowledge but an awareness of the shape of the literature. I used to joke that I could get through any interview with "look it up in the hashtable" and "look it up in the literature". If you're aware that something exists in the literature and know how to look it up you can get it done. For instance I wouldn't trust anybody (including myself or ChatGPT) to code up a binary search correctly out of memory, but I would look it up in a (recent) algorithms book.
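To be fair about what "look it up" buys you: even the textbook version has well-known traps (the bounds updates, the loop condition, and in fixed-width languages the (lo + hi) overflow). A sketch of what I'd write after checking the book, in Python, where only the off-by-one traps apply:

```python
def binary_search(items, target):
    """Return an index of target in the sorted sequence items, or -1."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:           # <= so a one-element range is still checked
        mid = (lo + hi) // 2  # in C/Java use lo + (hi - lo) / 2 to avoid overflow
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1      # the +1/-1 matter, or the loop can spin forever
        else:
            hi = mid - 1
    return -1

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
assert binary_search([], 1) == -1
```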
What's important is how they think and how much knowledge they have. No one cares if you use the terminal or you're still using windows for programming.
Lot of the course work is around mathematics, data, machine learning, compilers and that is something that got them excited too.
A take home test really works well in this setup as they can research on these concepts and try to solve it if they are really motivated. However this does not scale.
We test them on how fast they can learn things, how motivated a candidate is, how driven and ambitious the candidate is. If the grad is really good, these concepts can be picked up pretty fast from peers.
Likewise, I could have taken a database class but I wasn't interested.
I did take the programming class, though. Theorems, proofs, pen, paper.
Obviously, my university's CS department came out of the math tradition of CS, not the engineering tradition. That doesn't mean I didn't take a hardware design class, though.
People forget the memorization after taking the final exam, and it takes some dedicated interest to take that many similar courses in a row.
So basically it's the old problem of juniors that in order to have a chance to get experience they need to have experience. If I was in a position of hiring graduates, I would focus on how well they master the true fundamentals, and let them pick up the practical details in the job.
There are also a decent amount of electives though, I have a strong networking and IP knowledge because I picked networking courses in my final year
In a lot of cases those areas are also covered solely by electives, so unless the student was lucky enough to take those classes, or they happen to spend their free time reading about these topics, they won't know them.
Heck, even a decent-sized side project would quickly expose one to DB intricacies, to computing tasks in parallel, or to computing across networks of computers.
I don't see how more stringent CS curricula would help here; the market is often ahead of academia in this respect (computing).
But generally, since moving to Europe, I see that a lot of new grads don’t have that foundational knowledge as you mention. I’ve even had other colleagues complain that university recruiting was terrible because people had zero idea about things like operating systems. My feeling is that unis started targeting skills that were required in the job market instead of foundational knowledge.
In contrast, my uni was very “systems” focused and didn’t really focus on skills employers were looking for at the time.
I'd expect that if a CS student has a passion for what they do they will try to learn topics beyond the curriculum. Doing the bare minimum has a name: mediocrity. Maybe that is what you are seeing.
Other comments mention that students come unprepared and thus introductory courses need to be more/longer. I think that too is true.
There's a bell curve to aptitude and drive to learn. The curriculum at any university reflects the intersection of that median in the student body with a social narrative.
OP – what do you think this dynamic you have noticed would have to do with the folks picking their classes? Could it be more related to, say, your notion of what constitutes topical knowledge, or indeed, how to measure it?
What are you interviewing for? Examples of your dialogue might help but the power of reasoning from first principles, ELI5 comprehension, window management continues to increase over time while the value of mastering specific content remains very context dependent.
There is usually a gap of 1-3 years for university curriculum to be approved to update. Some of the above topics above might be just one course, or one chapter of one course, or a few pages of a chapter.
As you know, in that time things can completely change in both established and new areas of knowledge.
Learning how to learn is something that's the most important for a CS grad to learn from themselves.
Comp sci students don't typically deal with a lot of real data, and the requirements around data that shape how it's structured in the real world. They definitely do not deal much with scaling DBs, or concurrent access, or anything you might find in the typical distributed systems you encounter IRL.
> Concurrent Programming
Same for the same reasons above. They usually just have to do some simple things with simple systems on a local level.
> Network Programming
Same again.
Frankly you can't have it all.
What I see successful companies do is invest in uni programs so they have a stake on what the students learn. Take for example the UT Inventors Program.
This has been a problem for a while. Hopefully this provides some extra context.
So in general you cannot expect the typical college senior to know any particular one of these advanced topics. The baseline is still data structures and algorithms.
Yes. For example, here's MIT's computer science and engineering degree requirements. After math and basic programming and engineering fundamentals, it's all electives.
http://catalog.mit.edu/degree-charts/computer-science-engine...
Day one, we started with the Gang of Four, the 23 classic design patterns, and what MVC is. Then we are leading into SOLID, and I’ll push him into mastering Dependency Injection and Unit testing.
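For a taste of where I’m steering him: a minimal dependency-injection sketch (toy names, not our actual code). The service receives its collaborator instead of constructing it, so a unit test can hand it a fake:

```python
class SmtpMailer:
    def send(self, to, body):
        ...  # would talk to a real SMTP server in production

class FakeMailer:
    def __init__(self):
        self.sent = []
    def send(self, to, body):
        self.sent.append((to, body))  # record instead of sending

class SignupService:
    def __init__(self, mailer):
        self.mailer = mailer  # injected dependency, not hard-wired

    def register(self, email):
        self.mailer.send(email, "Welcome!")

# The unit test injects the fake and asserts on behavior; no network needed.
fake = FakeMailer()
SignupService(fake).register("new.user@example.com")
assert fake.sent == [("new.user@example.com", "Welcome!")]
```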
I feel like his college professor or something lol, and I’m still so surprised that programmers aren’t taught the ABC’s in school.
Rather, none of those things are in fact foundational to the field of "Computer Science" -- it's not a programming/software engineering program or apprenticeship. For better or worse (and you clearly think worse!).
OP, I'm guessing you do not have a CS degree?
Most developers probably do not need data structures, OS, compiler type courses, but instead would benefit from higher level, engineering type courses that reflect modern application design and development.
Note: I am 3 decades out of school, and there have been some moves in this direction I can see from interviewing juniors. But not enough.
I encountered the same lack of foundational computer science knowledge, but - here's the twist - mostly from American students.
The best candidates I ever interviewed came from (in no particular order) : the Technical University of Munich, 42/Epita/Epitech, and the École Polytechnique Fédérale de Lausanne.
Needless to say I didn't get that job. I still do poorly at interviews, but I have gotten a little better.
Don't hire people based off programming riddles.
Don't hire people because they know an SQL join - hire people who can come up to speed with SQL in a reasonable amount of time.
Industry has a responsibility to teach good coding practices. That's just the way it is.
primarily, it is your expectation that advanced low-level knowledge of these topics is "fundamental" to being a modern programmer. today, you can get along just fine without knowing them.
additionally, most cs graduates are simply taking the classes for their major, and it is not their great underlying passion or hobby. where you or i spend our spare time reading about how chess.com balances load or the changelog of the new postgres point release, most students will simply be doing something else.
i took a few introductory cs classes at my university. we were taught to use IDEs and were only given a cursory explanation of the shell. i would consider proficient use of a shell to be foundational knowledge, but in an era where all your files exist within a gui and the only shell you ever need to touch pops up at the bottom of vscode, it can be largely avoided.
most students don't feel the need to go beyond the bare minimum to graduate, so they don't.
probably. although these were all topics that i taught myself through books and practice anyhow.
there are great resources for learning these topics these days, many of which are even free.
maybe try asking them how they focused their studies and try to suss out if they're willing to fill any gaps they may have.
Also, it is a very good example of Goodhart's Law: https://en.m.wikipedia.org/wiki/Goodhart%27s_law
Doing a post final semester internship sounds kind of weird to me, so I would suspect you are European?
There is somewhat of a divide between computer science and engineering. Computer science tends to mean all math and algorithms. Engineering is closer to what you listed.
Why would you expect interns to have deep understanding of these topics? What required courses would they have learned about these topics in?
Why don't you already know not to expect this knowledge from candidates?
We were taught SQL heavily.
Network programming like packet switching, etc was something we didn't get into until grad school though.
I was being paid to write code for over 15 years before I needed to interact with a database or know anything about a network. What you would call foundational knowledge others would call niche.
And because some people say they did CS, while in fact they did some related study. No experience = no knowledge.
It's not just people who just finished their study.
I have been thinking the same thing. I am training a CS student in computer operation fundamentals, and it appears that colleges do not teach the practicalities of computing. I can only assume it's for cost/time reasons.
I've seen postings even require you have authored OSS and you provide a repo link. I guess you end up with fewer candidates or none.
Concurrent programming, for example, is also not taught the same way everywhere. In school I learned concurrency theory, like threads and mutexes, but I only "really" learned it for the first time on the job.
I do not think it is reasonable at this point to expect a candidate to have basic knowledge about _all_ the areas of CS.
- You can write a basic program with loop/branch/etc. structure. Nice if you know a bit about recursion (fibo), Nicer if done side projects of any kind;
- You are eager to learn
MIT has an extra online class teaching their CS grads the basics of software development.
Look for competency, not knowledge
The answer is obvious.
A "typical" CS student cannot care about all this and also jump hoops to land a job.
For example in my current programming job I don’t do anything involving networking or databases
Garbage In, Garbage Out.
The motivations of the average freshman picking CS major has changed over time.
Because a lot of companies have moved on to NoSQL databases and key value stores.
> * Concurrent Programming
Because many languages use lightweight threads (fibers, coroutines, etc.) now that don't involve context switching.
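To illustrate the lightweight-thread point, a minimal asyncio sketch (the sleep stands in for real I/O such as a web request): thousands of coroutines multiplex onto a single OS thread, with no kernel context switch per task.

```python
import asyncio

async def fetch(task_id):
    # Stands in for an I/O-bound operation (web request, DB call, ...).
    await asyncio.sleep(0.1)  # yields to the event loop instead of blocking
    return task_id

async def main():
    # 10,000 concurrent tasks on a single OS thread; far cheaper than
    # 10,000 kernel threads would be.
    results = await asyncio.gather(*(fetch(i) for i in range(10_000)))
    print(len(results))  # 10000

asyncio.run(main())
```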
> * Network Programming
Because these days barely any company does anything beyond web requests and nobody implements these from scratch. No DNS lookups, no ACKs, no manual buffer writing and reading, no marshalling, etc.
I doubt that this situation has changed much in the intervening time; if anything, it's likely gotten significantly worse. So, it stands to reason that most CS graduates actually don't care about programming, computers, computer science, or anything related, but see this as a pathway to get ahead in life via a decent-paying office job that has a high probability of remote work in their lifetime. There were 3 people (including myself) out of around 450 people in the CS program I attended who seemed to actually care about knowing how computers really worked, and all three of us went on to have pretty lucrative careers. I have no idea how the other 447 people ended up, because I didn't associate more than required with folks who thought college was mostly about getting blackout drunk and still getting the paper at the end of the experience.
At least for me, my observation has been that the folks who actually cared all did quite well, and these were often the folks who actually put in the effort to do internships and co-ops or contributed to open source software during their time in college. If you are concerned with the skills of intern candidates, I'd suggest either things have degraded far beyond expectations from a few decades ago, or alternatively they have strong skills in areas of CS that you aren't looking at. I know a lot more about at least one topic than anyone I meet, and I presume anyone I meet knows a lot more about at least one topic than I do. CS has become a much broader field in the time since I was in college, and it's very reasonable to believe that students now who care deeply about CS also care about different things than I did. I spent a lot of time messing with network programming and getting interested in information security, these days I imagine someone might be more interested in things like optimizing web-assembly or building cross-platform applications using react-native, and learning these more "front-end" things deeply, vs focusing on systems programming concepts. For many people, the browser basically is their operating system.
Databases didn't fit into my schedule because I was also studying philosophy and mathematics, and I focused my electives on areas where there was a lot of overlap between at least two of those subjects: graph theory, abstract algebra, formal logic, formal semantics, philosophy of language, philosophy of science, compilers, formal languages & automata, programming paradigms (comparative survey of functional, logic, and object-oriented programming), etc.
I don't think network programming and concurrent programming were even offered at my school, but I did implement networked multiplayer for a TRPG I wrote in my OOP class, one of the few CS courses where we were assigned relatively long group projects. I have no idea if it was any good from a network programming perspective, but that part of our game worked reliably.
I think those topics you mention would be more central to the curriculum of a software engineering degree (which was not offered at my school).
Allocate time to teach others around you.
The foundations are things like (this list should not be read as being comprehensive):
* Formalized languages
* The implementation, all the way down to the hardware, underneath those languages, and how to bridge between them (this is far too frequently omitted unless it comes up in some one-credit C course; lately we're even seeing C replaced by C++ courses that try to pretend to be abstract, which makes them pretty pointless to teach at all)
* Algorithms
* Complexity (Big-O notation or equivalent), computability in general, and mapping underlying algorithms to different problems
* The reality of just how different an implemented language and platform is from the abstract idea of one: limitations on sizes, errors, failures, etc., all the things that complicate the life of a theorist trying to do real work
I'm especially annoyed by the C++ classes: said language is a massive cognitive load to inflict on students, and a huge vocational distraction from the theory and concepts a degree SHOULD be teaching. A better course spread would be some machine language, C, LISP, Smalltalk, some (modern version of a) goal-oriented language like Prolog or KL/1, something with intrinsic multiprocessing support, something essentially distributed, and so on: languages that demonstrate the breadth of what a language can encompass, rather than grinding students into the bottomless pit that C++ has become.
I do agree that these all are relevant: databases (by implementing one, with attention to ACID, but with a lot of assumptions about reliability given and highlighted), concurrency (both with a language that does it intrinsically, and in one that doesn't), and network programming (at least three totally different approaches here: intrinsic to a cluster environment, intrinsic to the language, and implemented via libraries like in C). However, these ideas are not each important enough to count as core.
The point of a non-vocational, classical degree is to be able to understand the field and to be able to create tools - including new ones. The higher the degree, the more important it becomes to be able to extend the field. The objectives (in part) of a classical degree are what I've described, with the goal of producing synthesists and creators within the field of computer science.
In the vocational education, the grads will hit the ground ready to write code using existing libraries and tools, perfect to drop into some project underway to use what they've learned to tie everything together. But they'll be pretty naïve when it comes to creating them, and generally unaware of ideas that fit the tasks better and just need to be pulled in. Have them learn whatever language is the current fad for a couple of years, and train them in all the current things. But they'll have a harder time as the tools shift underfoot.
Some things bridge both the classic and the vocational. Source code control, for one (UTexas CS basically requires knowing git, for example), plus all the varieties of tools we take for granted to share work, google, communicate, and try to make AI write our homework or job assignments. But the classes for git should be quite different in a vocational versus a classical curriculum.
Basically, given what I've heard from former students at various places (and I taught for a decade myself), I see many colleges leaning towards teaching a vocational CS curriculum while pretending it's classical, and this damages the field overall. At the same time, I've seen overly theoretical degrees in CS that I think are also a problem, in a different way, if the students were led to believe they'd be able to actually create software when they were done.
The most pathological example I've seen is a professor at the University of Texas who was trying to teach his students IPC using C and Unix as the demonstration environment (essentially a classical lesson). However, since the professor's awareness of the implementation was too limited, he was using examples built on the wrong paradigm: FIFO pipes instead of sockets. The result was that the examples only worked for processes with a shared parent process. This undercut the objective so badly that the students were missing the point of IPC entirely; they could have had a single process produce essentially the same results as the example, and were getting no payoff from the FIFO aspect. The professor's limited vocational grounding was producing students who failed to understand both the practical AND the theoretical.
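To make the distinction concrete, here's a minimal sketch in C of what the sockets version buys you, assuming (as the symptom suggests) the lectures were built on anonymous pipe() pairs, which must be inherited across fork() and therefore only connect related processes. A Unix-domain socket bound to a filesystem path lets two completely unrelated processes rendezvous, which is the actual payoff of IPC (the socket path is arbitrary, and error handling is omitted for brevity):

    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>

    /* Unrelated-process IPC: any process that knows the path can connect,
     * unlike a pipe() pair, whose descriptors must be inherited via fork(). */
    int main(void) {
        int srv = socket(AF_UNIX, SOCK_STREAM, 0);
        struct sockaddr_un addr = {0};
        addr.sun_family = AF_UNIX;
        strncpy(addr.sun_path, "/tmp/ipc-demo.sock", sizeof addr.sun_path - 1);

        unlink(addr.sun_path);                 /* remove any stale socket file */
        bind(srv, (struct sockaddr *)&addr, sizeof addr);
        listen(srv, 1);

        int client = accept(srv, NULL, NULL);  /* any process may connect here */
        char buf[128];
        ssize_t n = read(client, buf, sizeof buf - 1);
        if (n > 0) { buf[n] = '\0'; printf("got: %s\n", buf); }
        close(client);
        close(srv);
        return 0;
    }

A client in a separate, unrelated process just creates its own socket, fills in the same path, and calls connect(); that cross-process rendezvous is exactly what the FIFO examples were hiding.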
So it's not just a case of a poor curriculum poorly serving the students, the professors themselves are suffering from the problem of being too polarized between theory and practice. The problem needs to be viewed as endemic in some colleges and I'm not sure anyone's really talking about it enough.
The three bodies of knowledge you list (DBs, concurrent programming, and network programming) are not mainstream at all. Someone can go through an entire career in software engineering and never touch any of these areas except, perhaps, superficially through libraries, etc.
Is it, then, fair to judge recent CS grads based on a test of these skills?
Again, while I agree with the general sentiment that the average CS graduate has serious gaps in knowledge, skills and experience, I have to temper my thinking.
I am not sure it is fair to use such skills tests as a metric, any more than giving someone 60 minutes to complete 60 calculus problems measures their understanding of the topic, their ability to apply it, or their capacity to learn what they don't know.
That last part, to me, is the most important thing I try to learn about someone in an interview. I could not care less what they know today. The basics have to be there, of course. Past that, I need to understand how their mind works, if they are adaptable and what their approach to learning and applying something they don't know looks like.
Some examples of the stunts I will pull:
- Implement something in LISP, Forth or APL, knowing they don't know the languages. I want to see how they react and solve that problem. No, not the code challenge; the matter of being asked to do something they don't have a clue how to approach.
- Write an FPGA module in VHDL when they have never used anything other than Verilog.
- Design a multi-failure tolerant circuit when they have never done such a thing
- Explain how to design an electrical DDR4 interface (again, knowing they have never done it).
- Expanding on that, explain how to design an SDRAM memory controller from scratch
Etc.
This isn't at all about looking for the correct or perfect answers. In my 40 years in engineering, probably 75% to 90% of what I have worked on has had an element of "How the hell do I do this?". You want people who can deal with that, adapt, and deliver. Engineering is constantly evolving; what someone learned in school, at some point, becomes irrelevant.
When I went to engineering school, FPGAs, the internet, and digital circuits running in the GHz range did not exist. I had to learn all of that, and more, as life and career progressed.
I think the right paradigm and metric is to evaluate the person rather than the contents of the mental database they happen to have stored at that point in their journey.
It's like asking why someone with a degree in theoretical physics cannot fix my car.
The real question is why so many people with CS degrees don't know what CS is really about.
your choice.