After working as a software engineer for 5 years, I've forgotten all the CS stuff
I have a Bachelor's degree in CS, but after working for 5 years I can't remember CS topics like computer organization, OS theory, or database normalization. What occupies my mind is delivering features to meet business goals, plus Kubernetes stuff.
Is this normal? Or do I need to work harder to relearn them?
Normal, yes.
The nice thing about forgotten knowledge is that it's much easier to jog the memory and get it back than to learn it from nothing. You likely still use a lot of the fundamental concepts without realizing it.
I am someone who came from a non-traditional, self-taught path ~25 years ago. I did get some formal schooling in electronics, which you'd think would be useless, but I can think of a few times a solid understanding of digital logic has served me well. On the other side of that, I designed some shitty databases early in my career! Understanding normalization could have saved me a lot of headaches when I was new.
As you progress in your software engineering career, the level of abstraction of the knowledge you use on a daily basis gets higher and higher. The academic CS concepts underpin everything, but you rarely need to go more than a few levels of abstraction down to solve problems in the real world (at least in most non-cutting-edge jobs).
For example, to build a web app, the level of abstraction your job requires is probably that of web frameworks and APIs. Now and then you'd need to understand the web protocols. Rarely would you need to write software involving the low-level transport mechanisms. More rarely still would you inspect packets yourself and verify and decode them with pen and paper. Rarest of all would you need to use maths to design the signalling procedures between hardware components. The point being that it's normal not to know all the lower-level CS stuff off-hand, but you recognize the patterns, and on the rare occasion you actually need to access that information you wouldn't be at a complete loss.
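To make the abstraction ladder concrete, here's a rough Python sketch (purely illustrative, not tied to any particular stack): the same "fetch a page" task at the library level, then a few rungs down against a raw socket with the HTTP request spelled out by hand.

```python
import socket
import urllib.request

# High level: the abstraction most day-to-day work lives at.
with urllib.request.urlopen("http://example.com/") as resp:
    body = resp.read()

# A few rungs down: the same request against a raw TCP socket,
# with the HTTP/1.1 request line and headers written by hand.
with socket.create_connection(("example.com", 80)) as sock:
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    raw = b""
    while chunk := sock.recv(4096):
        raw += chunk
# 'raw' now holds the status line, headers, and body as bytes --
# exactly the stuff the higher-level library parses for you.
```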
In some specializations of programming, you're going to need a lot of those things. For instance, working with game engines, scientific simulations, image or signal processing, finance, or simply making the base software and libraries that other people use, can involve a lot of CS.
In larger corporations, the programming is often much higher level, and consists more of stringing together libraries and frameworks and entire systems so that they fulfill a business purpose. Even simple programs can take hundreds of megabytes of memory and have tens or hundreds of dependencies beyond anyone's control.
If you want to keep practicing your algorithm skills, you might try something like https://projecteuler.net/ , which is very mathy, or https://checkio.org/ , which is a bit more user-friendly, and get some practice there. As for OS theory, there are always open-source operating systems one can contribute to, though I suspect many of them would consume a lot of a person's time.
I find the most useful thing about CS knowledge (and especially algorithms & data structures) is having a hunch for the shape of a problem and knowing what to google for. It can save a lot of thinking and stave off performance problems before they occur. But you’re there to deliver business value and it’s only right that those concerns should dominate. That doesn’t mean you won’t have “this database is completely unmanageable and we can’t add new features fast enough” or “this complicated request is incredibly slow and customers are angry” moments that your knowledge can solve though.
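To give a small, hedged illustration of that "shape of the problem" hunch (the function and field names here are invented for the example): spotting that a nested scan is quadratic, and that a hash-based lookup flattens it, is exactly the kind of performance problem you can stave off before it occurs.

```python
# Quadratic "shape": for each order, scan the whole customer list.
def attach_names_slow(orders, customers):
    result = []
    for order in orders:                        # O(n) orders
        for customer in customers:              # O(m) scan per order -> O(n * m)
            if customer["id"] == order["customer_id"]:
                result.append((order["id"], customer["name"]))
                break
    return result

# Linear "shape": build one dict, then do constant-time lookups.
def attach_names_fast(orders, customers):
    by_id = {c["id"]: c["name"] for c in customers}                  # O(m)
    return [(o["id"], by_id[o["customer_id"]]) for o in orders]      # O(n)
```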
The thing most folks don’t realize is that Computer Science isn’t Software Engineering. They’re related but distinct.
Software Engineering is most of the day-to-day work you do as a software engineer. It's more about people, process, and execution. In reality, most problems you'll encounter in day-to-day software engineering won't be CS problems: lots of CRUD-shaped apps of varying forms and sizes.
Occasionally, you’ll get a real CS shaped problem and those are fun.
As others say - refreshers worked well since you’ve already learned the material. Find some interesting books or ideas to explore and they’ll come back.
It's normal. And the point wasn't to always recall those specific pieces of knowledge; it was to get you to understand those concepts once. Once you did, understanding similar concepts becomes easier and you can spot more intricate patterns. Your intuition will be better in situations where you spot or create things seemingly out of nowhere.
I tend to be a somewhat slow learner but once I've learned something, I seem to remember it forever (I only lose a little bit of detail).
I remember most of the stuff from university 10 years ago including the brand and model of the microcontroller I programmed (ATMEL ATMEGA8-16PU) and the names of the I/O registers on that chip, the name of the program I used to program it with (AVR Studio). I also remember most stuff I learned in Discrete Mathematics, pretty much every ADT (Abstract Data Type) I learned about in my Algorithms and Data Structures class and I remember pretty much everything I learned in Machine Learning (decision trees with minimax algorithm, alpha-beta pruning, heuristic functions, Artificial Neural Networks, step functions, sigmoid functions, backpropagation algorithm)... Time and space complexity...
Except for time and space complexity and ADTs, I haven't really used the other stuff.
A broader understanding gives you more fallback positions in the event of some disaster or catastrophic economic collapse, so I'd say:
Put a little bit of your time into learning interesting new skills in even vaguely related fields. A Jack of all trades may be a master of none, but if chaos is the future, adaptability is the key to survival.
It is normal. You may need to relearn what you have completely forgotten.
I try to integrate my CS education in my normal software engineering work. The work is high level, not a lot of opportunity to use theory of computer architecture or operating systems, but I'm still writing programs. This means lessons from algorithms and functional programming are relevant. Organizing code using this theory has not only retained the college lessons but reinforced them. My advice is to use the tools from your education when you can, and they will only become stronger when you do.
Yes, it's normal. What you don't use you lose. When you need it, you'll be surprised how much will come back but don't expect to be as good as when you learned it.
Is it worth relearning? Only when you need it. The big problem comes when you need to change jobs: if the industry has moved on from what you know, you'll have a hard time finding a job. What makes sense is to stay aware of industry trends and keep up to date.
Don't panic! That's how education works. I don't use every tool in my shop every day, but when I need them, I know I can dust off the manual and figure them back out.
College was just your first stepping stone of hopefully a very long and interesting journey. It might have been special to you, but there's no need to take it along.
They’ll come back when you need to use them. My experiences: using compiler theory stuff, state machines, database normalization, algorithms on the job. You start off feeling rusty and forgetful but once you do some reading and roll your sleeves up it comes back to you.
Also I wasn’t a very diligent undergrad and my memory isn’t exceptional :) YMMV
You should invert the question: since EVERYONE forgets all of this CS stuff, perhaps teaching it to us programmers is a complete waste of our time in the first place?
So it's not YOU who is "wrong": it's the universities who have it backwards and are disconnected from the real-world work that programmers do.
This is quite normal. The CS curriculum is more about breadth than depth.
My personal experience has been that this knowledge is useful as a way of knowing “the lay of the land”. My brain knows that a cpu has a limited addressable space due to the length of bits used to represent memory addresses. It has a sense for how bits travel between networks. A rudimentary idea of how computers actually work.
Systems engineering makes you think about these things a lot more. e.g. I had to go back and relearn most of my networking knowledge to really grok DNS and firewalls in order to debug application connectivity issues. When you dig into them it feels like revisiting a garden you have been to earlier and you are excited to know how its changed over the years and whether there’s any new flowers in there.
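To be concrete about the addressable-space point above: the arithmetic itself is trivial, but it's the kind of thing that sticks even when the details fade (a throwaway sketch, nothing more):

```python
# 32-bit addresses: 2**32 distinct byte addresses, i.e. about 4 GiB addressable.
print(2**32)              # 4294967296
print(2**32 / 2**30)      # 4.0 (GiB)

# 48 bits of virtual address (what many 64-bit CPUs actually implement today)
# already covers 256 TiB.
print(2**48 / 2**40)      # 256.0 (TiB)
```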
You might not have it at the front of your mind, but you have probably internalized a lot of it.
When you see a problem that would benefit from application of CS theory you might end up applying it without realizing that is what you are doing.
You are way better off than a CS graduate who knows all those things but can't solve simple programming tasks.
I've retaken subjects by watching MIT courses on youtube. It shakes the rust off, and it comes back.
Honestly OP, it’s pretty normal. You have gotten used to working at a different level of abstraction. If you’re really concerned, you could crack open some textbooks and jog your memory. But you likely haven’t forgotten; you’re just absorbed in a different type of problem.
You’ll be okay and there’s no reason to panic! Panic just makes everything worse. Chill: you’ve got this.
I studied math and physics, not CS, but the situation is probably similar. I think the best, if not only realistic way to retain what I will call "theoretical" knowledge is to be genuinely interested in those things, possibly to the detriment of the most rapid possible career advancement.
The other thing is that it's normal. As systems get more complicated, the number of interactions between N elements grows as O(N^2), since every element can potentially interact with every other. Thus, most "engineers" spend the bulk of their time organizing and arranging things, fitting things together, troubleshooting, dealing with vendors, etc. In some fields, manually operating tools such as CAD becomes more efficient than understanding the algorithms programmed into the tools. This is basically human-powered information processing. The work of making the things that get fit together, such as algorithms, falls to a small fraction of the workforce, who are not necessarily the most highly rewarded, since their work is not so visible.
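A quick counting sketch to back up that growth rate (standard combinatorics, not anything specific to the systems above): the number of distinct pairs among N elements is N(N-1)/2, which is quadratic in N.

```python
from itertools import combinations

# Pairwise interactions among N elements grow quadratically.
for n in (10, 100, 1000):
    pairs = sum(1 for _ in combinations(range(n), 2))
    assert pairs == n * (n - 1) // 2
    print(n, pairs)   # 10 -> 45, 100 -> 4950, 1000 -> 499500
```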
That matches my experience. In all honesty, as a programmer working on enterprise software development, I feel more like a mechanic than an engineer who designs machines. To stretch an analogy, I spent time learning all of the ins-and-outs of how a transmission works and how to make one from scratch. But in the real world the transmission was already made by someone decades ago and my job is to maintain it and swap out some parts when necessary.
I suppose this would be an issue if I were working at a startup or doing some greenfield development, but I suspect the majority of software developers/engineers are working on a codebase that is already established. The need for those from-scratch fundamentals is low to non-existent, so naturally that knowledge atrophies.
Perfectly normal, in a very wide range of fields. A friend of mine was a career Civil Engineer (& Professional Engineer, etc.). He ditched all his old textbooks & such when he retired - because he had neither looked at them in 30+ years, nor ever needed to.
Don't worry. You've got the intuition necessary to ask the right questions and filter through the results to quickly find the solution to whatever problem you're facing in CS when it comes up.
You only hurt yourself by drawing a line between academic knowledge and work knowledge. Very few people are actual geniuses that just retain information. Rather, most people know about things because they use them every day, that's all. Professors and researchers know CS stuff because that's what they need to get a paycheck. You know K8s because that gets you a paycheck. The fact that the former is called "knowledge" and the latter a "skillset" is just a technicality that shouldn't bother you. All knowledge is equal. If you need to know more CS stuff you will learn it no doubt!
I had an interview the other day where they asked a question about Big O notation; I didn't really know the answer, so I guessed. In 10 years of web dev and backend engineering work I've never needed to use it.
I watch videos and read Wikipedia to try to keep those things fresh in my mind. Do I still remember the Greek alphabet? What's the formula for (n choose k)? What's the difference between a permutation and a combination? Can I still write a Turing machine? What about a Turing machine with multiple tapes?
I have a bunch of those "questions" that I try to always remember.
My YouTube history is a bunch of those types of videos with some best moments clips from my favorite comedy shows in between them.
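For what it's worth, the combinatorics questions are the easiest ones to self-check, since the standard library will grade you (these are just the textbook definitions, nothing beyond that):

```python
import math

# "n choose k": order doesn't matter -> combinations.
#   C(n, k) = n! / (k! * (n - k)!)
print(math.comb(5, 2))        # 10

# Permutations: order matters -> P(n, k) = n! / (n - k)!
print(math.perm(5, 2))        # 20

# The relationship that distinguishes them: each combination of k items
# corresponds to k! different orderings.
assert math.perm(5, 2) == math.comb(5, 2) * math.factorial(2)
```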
The most important part of any foundational education like yours in computer science is to solve the problem of unknown unknowns. It isn't necessary to recall all the answers to questions, what is useful is to know what kinds of questions have already known answers (and by corollary, which ones don't), so that you can look them up when needed rather than trying to reinvent the world.
That may also be because you are working a Software Engineering job as a Computer Scientist, or even as an Analyst Programmer (a technical profession that doesn't require a degree).
These are 3 very different professions that have 'coding' in common. And given the huge demand for "analyst programmers" nowadays, a lot of people who studied for the other two end up working as the latter.
I find it comes back quickly - with a brief review
It's mostly a matter of working memory. If you know which forgotten things you'll need and when you'll need them, it isn't harder the second time; you just refresh, and each refresh after that gets easier. Sometimes, to learn a new concept, you have to set a few others aside. The hard way is when every step forward requires going back to refresh something else you need to cross-reference.
It just means you've been doing jobs where CS is (apparently) unimportant. If you're ok with that type of job then by all means, continue, it pays the bills. The downside is that it might get mind-numbingly boring in the future. Pushing bits between API calls isn't exactly computer science.
Yes it's normal and, dare I say, a bit of an exaggeration. Academia and industry are different environments.
Totally normal. There will be some situations where you'll encounter those concepts again, like during troubleshooting X, or reading documentation on something along those lines. You may not remember it perfectly but the concept will be familiar.
Because everyone does, most IT people don't even use UML in practice ¯\_(ツ)_/¯
I've heard programming != Comp sci, so perhaps it's not a bad thing.
Academia, at its worst, is a closed system; it should be regarded as a lucky coincidence if any practical ("real world") skills are obtained in the process of earning a BS/MS/PhD.
My unpopular opinion is that you only forget what you never really learned, and that you only learn what you do.
The vast majority study only to pass exams. So your situation is pretty normal.
Your brain is an LRU cache. It'll prioritize what you're using daily and what knowledge you need to rely on. It's completely normal to forget stuff, even the fundamentals.
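To stretch the analogy with a sketch (the cache contents are invented for illustration): an LRU cache evicts whatever hasn't been touched for the longest time, which is a fair model of what happens to the coursework you haven't needed since graduation.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache: touching a key keeps it alive,
    neglected keys get evicted first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)           # mark as recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)     # evict the least recently used entry

brain = LRUCache(capacity=2)
brain.put("kubernetes", "used daily")
brain.put("database normalization", "used in school")
brain.get("kubernetes")                        # refreshed
brain.put("sprint rituals", "used daily")      # evicts the normalization entry
print(brain.get("database normalization"))     # None -- until you relearn it
```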
Totally normal. Not a day goes by without googling fundamental stuff. What matters, to me, is that over time your skill at finding the answers you're looking for gets more accurate and precise.
Yet CS stuff is exactly what they ask you about in interviews. I really don't know how interviewers keep up with it, or whether they just read up on the answers to their own questions.
This may also apply to non-CS things you've learned in the past, such as biology, chemistry, physics, math, etc., to varying degrees.
>> I can't remember CS topics
Kind of normal.
>> What occupies my mind is delivering features to meet business goals, plus Kubernetes stuff.
Definitely not normal.
We have a local saying: "I can't remember it now because I never knew it in the first place." When I say it to myself, everything falls into place.
You will forget things but if you had your textbooks and flipped through them, would it come back? Have you tried doing that?
Programming is a very broad term, and there are many levels at which you can do it: from something as low-level as writing firmware for a device to something as high-level as writing a VB macro in Excel. All of these are programming, and depending on the kind of programming you are doing, different kinds of knowledge are useful.
The CS fundamentals you are talking about matter when you are doing a specific kind of programming. For example, OS theory matters when you are writing an OS.
Yeah, it's part of why certain interview styles are so tough. It's hard to remember classes you took years and years ago.
Wait till you hit 10, 15, 20 etc.
The prospects of passing LeetCode interviews without a major time investment are not great.
It's normal, and you'll only need that stuff for the next time you're doing 9 rounds of interviews.
Forgetting fundamentals is not normal, at least that is what I thought.
The responses in this post are changing my perception. No wonder recruiting is hard.
I have found that knowledge of OS, computer organization, digital logic, networking, and compiler design is crucial. If you are into data science or ML, math is absolutely essential. I don't know why so many people think forgetting the fundamentals is normal.
I am going to use Anki to study again. Is anyone else using Anki for CS?
They are still in your mind, just locked in a little dusty room.
Welcome to the tail end of the Dunning-Kruger curve.
Everyone experiences this eventually, but not everybody realizes it at first.
Now that you know your capacity, I recommend doing an annual round-up of the things you still remember and can confidently put on your CV.
Also note how job interviews all but ignore this phenomenon.
No, it is NOT normal if you truly mean you remember nothing from your studies. I really hope that is not the case and that you have only forgotten the intricate details (which is fine). The absolute fundamentals (some of which are listed below) should never be forgotten.
1) Computer Architecture vs. Organization, CPU/Execution Units/Caches/Memory, Clock, Pipelining, ILP, Multiprocessor, SoC.
2) OS responsibilities - Resource management (everything), Kernel vs. App, Process vs. Thread, System Calls, Timers/Interrupts/Device Drivers.
3) RDBMS Normalization - "The Key, The Whole Key and Nothing else but the Key" (see the refresher sketch after this list).
4) Serial vs. Parallel vs. Distributed programming.
Hit the books again if the above don't ring a bell.
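Since database normalization is one of the topics the OP named specifically, here is the refresher sketch promised above (invented example data, shown as plain Python structures rather than SQL): in the flat version, the department name depends on dept_id rather than on the employee key, which is exactly the "nothing but the key" rule being violated.

```python
# Denormalized: dept_name depends on dept_id, not on the employee id,
# so it's repeated and can drift out of sync (a transitive dependency).
employees_flat = [
    {"emp_id": 1, "name": "Ada",   "dept_id": 10, "dept_name": "Research"},
    {"emp_id": 2, "name": "Grace", "dept_id": 10, "dept_name": "Reserach"},  # typo creeps in
]

# Normalized (3NF-ish): every non-key column depends on "the key, the whole
# key, and nothing but the key" of its own table.
employees = [
    {"emp_id": 1, "name": "Ada",   "dept_id": 10},
    {"emp_id": 2, "name": "Grace", "dept_id": 10},
]
departments = {10: {"dept_name": "Research"}}   # one place to fix a typo
```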