HACKER Q&A
📣 nikolasn

What will stay the same in the next 50 years?


As a senior CS student, I am interested in looking for ways to improve my professional skills and develop habits that will ultimately pay off long-term in my career.

From browsing internet forums and listening to friends it seems that a popular approach to bettering oneself as a software engineer is to "learn the latest technologies". However, with so many new technologies to choose from, how will I know which ones will benefit me long term? This reminds me of the Red Queen hypothesis [0], learning new stuff just to keep up.

I suppose the stereotypical example is the web development space with its many frameworks (Reactjs, Nextjs, Svelte, Remix). I notice this as well with programming languages such as Go, Rust, Julia, Dart, and Kotlin.

On the flip side, I wonder if mastering the fundamentals and the things that will not change in the next 50 years is a wiser approach. This is inspired by the ideas of Jeff Bezos and Warren Buffett [1, 2]. If this is true in technology, then I wonder what are those things that will not change in the next 50 years?

I also wonder how the Lindy Effect [3] applies to technology. Would it be more worthwhile to strive for a high level of proficiency in decades-old languages such as Java and C++, or focus instead on promising languages such as Go and Rust? Reading Dan McKinley's article on Choose Boring Technology [4] nudges me in the direction of focusing on more mature technologies.

[0] https://en.wikipedia.org/wiki/Red_Queen_hypothesis

[1] https://www.goodreads.com/quotes/966699-i-very-frequently-get-the-question-what-s-going-to-change

[2] https://fs.blog/staying-the-same/

[3] https://en.wikipedia.org/wiki/Lindy_effect

[4] https://mcfunley.com/choose-boring-technology


  👤 vmurthy Accepted Answer ✓
- The scientific method. Broadly speaking, it has served us well over the last three centuries.

- Humans oscillating between greed and fear. It has happened for the last few millennia :-)

- Showing up on time, being reliable, and doing your best leading to good outcomes.

I usually base my decisions around these three.


👤 jstx1
50 years is a long time. If you're looking for the answer to "what should I learn now so I don't need to learn anything else in the next 50 years?" then you're approaching the whole thing in the wrong way.

There's no conflict between learning new things and "fundamentals" (however we define them). It's useful to know both. You start with the more general stuff that applies everywhere (this is a lot of what you study at university: how a computer works, how code executes, how to design a program, data structures and algorithms, etc.) and then over time you pick up specifics like language features, particular APIs, frameworks, etc.

Languages don't matter that much. Learn a few, be aware of what's out there and get comfortable with picking up new ones as you need them. It's really not a big deal.


👤 lido

  Specific technologies that won't change:
    - Unix. It's been in use since 1969 and runs on almost every device. It isn't going anywhere.
      - In particular, become fluent with its ideas and the CLI. That group at Bell Labs was, hands down, the best programmers doing the best programming.
    - LISP. It's been in use (if less so) since 1958 and influences almost every modern language. Learning it is like learning the Greek classics: worthwhile because of their influence.
    - HTTP. The basics of the internet haven't changed and work pretty well.
    - Text. Text is the best way to convey information. Nothing beats it. Not pictures. Not audio. Not video. Learn text manipulation.
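The Unix text-manipulation philosophy, small filters composed over plain text, can be sketched in any language. Here is a minimal Python equivalent of the classic `tr | sort | uniq -c | sort -rn | head` word-count pipeline (the sample text is made up for illustration):

```python
import collections

def top_words(text, n=3):
    """Count words in plain text and return the n most common,
    mimicking the classic tr | sort | uniq -c | sort -rn | head pipeline."""
    words = text.lower().split()
    return collections.Counter(words).most_common(n)

sample = "the quick brown fox jumps over the lazy dog the end"
print(top_words(sample))  # [('the', 3), ('quick', 1), ('brown', 1)]
```

The point is less the specific tool and more the habit: once information lives in plain text, fifty years of composable tooling applies to it.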

👤 marttt
Regarding the importance of communication vs "hard skills", I often think of a quote by Nassim Taleb:

"You cannot possibly trust two classes of people: educators who are better at explaining than understanding because they’re selectively better, and science journalists, who are better at communicating than understanding. Then, you end up with things like scientism." [1]

The pandemic and climate crisis have also shown us the huge importance of thoughtful communication. Having both a theoretical background and years of practical experience in journalism, I do wonder, though, how much of this emphasis is slowly becoming a hype; a source of well-paid bullshit jobs. There are companies and institutions with deliberately honest communication out there, but oftentimes "good communication" leans towards masterful, smooth hiding of wrongdoings or of stuff you don't actually grasp. It's way too easy to (accidentally) become better at explaining than understanding in this field. In this light, it's probably always good to have those more straightforward "hard skills people" around you as well.

All in all, I think it's an excellent time to read Dostoevsky and think about the volatility of human temperament. We're often trying to hide our rough edges with (soft) communication, but who knows how well, or for how long, this actually works.

I would also definitely bet on Dostoevsky's fiction being relevant in 50 years, too.

1: https://medium.com/conversations-with-tyler/bryan-caplan-nas...


👤 dfcowell
Technology definitely will not stay the same. Fundamental knowledge may be resilient in the face of change, but the skills you need to cultivate to navigate that change and operate effectively in society and business are evergreen.

Learn how to reason about problems from first principles. Understand how backward planning works and how to apply it to a project. Cultivate your communication skills and build a habit of constant learning.

Learn about critical thinking and logical fallacies. Build heuristics you can use to evaluate ideas with incomplete information.

These skills will serve you well long after any technical knowledge you have expires.


👤 dataflow
Are you asking about technologies or are you asking about knowledge?

New technology often replaces old technology, but new theory (in math/CS) generally extends older theory, and problem-solving techniques are often timeless. So if you're looking to learn things that will be relevant in the long-term, your answer is likely algorithms, data structures, and the like. They're not going to be rendered irrelevant by the passage of time; they're going to become more fundamental as we build more advanced theories on top of them.
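As a concrete illustration of that durability, here is binary search, an algorithm that predates almost every language and framework mentioned in this thread and will outlive them too:

```python
def binary_search(sorted_items, target):
    """Classic binary search: O(log n) lookup in a sorted sequence,
    essentially unchanged since the 1940s."""
    lo, hi = 0, len(sorted_items)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    if lo < len(sorted_items) and sorted_items[lo] == target:
        return lo  # index of the target
    return -1      # not found

print(binary_search([2, 3, 5, 7, 11, 13], 7))  # 3
```

The syntax it is written in will date; the idea, and the invariant-based reasoning needed to get it right, will not.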

If the kind of stuff you're into is more like Rust, React, or whatever, then the stuff useful 50+ years from now probably won't be those particular technologies, but the ideas behind them. So instead of learning Rust, you'd need to learn about topics like abstract interpretation and formal verification. (Or to put it another way: try to figure out how to make a Rust compiler, vs. learning how to use it.) If you're into event-based technologies or parallel programming, go learn about the various models of computation (there are lots) and their strengths and weaknesses. There's a lot of prior literature and research in areas you might not expect, and you'll be surprised how often research from half a century ago suddenly becomes relevant again (like it did with parallel computing). If you like PyTorch, go learn about optimization and machine learning (and I'm not referring to neural network architectures here). There are grad courses on all of this stuff if you go searching around. You'll probably have to put more effort into them than you'd need to put into learning React or Go, but that's life.


👤 kashyapc
Plain text.

It lets us convey information in the most effective and durable way, with very high signal-to-noise ratio. (NB: I'm not dissing on other communication mediums; I see them as complementary to plain text.)


👤 xenihn
I think 50 years is way too long to plan ahead for. A ~9-12 year approach makes more sense. Look at these intervals:

1942 - First mass-produced fully-electric analog computer

1951 - First mass-produced commercial computer (using vacuum tubes)

1963 - First mass-produced integrated circuit mainframe computer

1974 - First mass-produced personal computer

1984 - First GUI operating system with widespread adoption

1995 - AOL, CompuServe, and Prodigy bring the internet to the masses

2007 - First smartphone with widespread adoption

~2017 - I'm actually having trouble coming up with something. Maybe cryptocurrency (specifically Ethereum) gaining widespread adoption, but it's too early to judge its significance. Same with VR.

The job market and what it meant to be a programmer changed so much between these intervals.

You might not even be in tech anymore in 11 years. Maybe you'll be retired after a lucky windfall. Or dead.


👤 ubavic
Mathematics.

Of course, mathematics will continue to develop in unimaginable directions, but the “base” notions in math will stay the same. I am talking here about the notions of relation, function, group, vector space, tensor, category, measure, integral... Even though some of these have been known for a long time, the formal definitions we use today are fairly new. And it seems to me that those will stay fundamental for a long time.

(It seems to me that other sciences, like physics and biology, don’t have that “stability”. In those fields, it wouldn't be too surprising if we discovered some fact that changes a lot of what we know.)

Also, since these core notions of math will stay the same, math notation will stay the same as well. And that is a good sign that we will continue to use TeX for a loooong time.
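As an illustration, the notation for these core notions renders in TeX today exactly as it did decades ago, and the underlying symbols are older still:

```latex
% Notation that has barely moved in decades:
% a function, a group's closure axiom, and a Lebesgue integral.
f \colon X \to Y, \qquad
\forall\, a, b \in G:\; a \cdot b \in G, \qquad
\int_X f \, d\mu
```

A TeX document typeset in 1980 still compiles and still reads as current mathematics, which is hard to say of almost any other digital format.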


👤 temikus
One of the constants I've seen in the past 15 years is soft skills and leadership - learn how to work in a team, how to socialize ideas, lead without authority, do cross-functional work.

This will make you stand out from your peers and ensure a good career path for years to come.


👤 bsenftner
Study math (linear algebra, geometry, calculus...) and statistics, in the pure form, then again realized in computer languages and visualized for communication in 2D and 3D with computer graphics, animations and compressed streams. And then again as realized in deep learning, how computer vision applies calculus to imagery, and how statistical reasoning and deep learning are siblings. This logical progression of math to building automated reasoning applications is eternal and will serve an entire career.
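That progression from pure calculus to automated reasoning can be made concrete in a few lines. This is a hedged sketch, not a production optimizer: a central-difference derivative driving gradient descent, the elementary form of the machinery behind deep learning:

```python
def numerical_gradient(f, x, eps=1e-6):
    """Central-difference approximation of df/dx: the calculus
    underpinning backpropagation, in its most elementary form."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def gradient_descent(f, x0, lr=0.1, steps=100):
    """Minimize a 1-D function by repeatedly stepping against the gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * numerical_gradient(f, x)
    return x

# Minimize (x - 3)^2; the minimum is at x = 3.
x_min = gradient_descent(lambda x: (x - 3) ** 2, x0=0.0)
print(round(x_min, 3))  # 3.0
```

Swap the scalar for a tensor and the finite difference for automatic differentiation and you have the core loop of every deep-learning framework.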

👤 ha1zum
I would simplify the problem as choosing between a boring but proven approach to things and a more innovative and exciting one that has a bit more risk. Like most things in life, I would try to balance it. Because I don't have the kind of stress resistance and energy that's required to go 100% on the more risky stuff all the time, but at the same time living 100% on the other side could be unbearably boring after some time.

At my day job I work as a Java+Angular developer on enterprise software, but if there's a chance to do a small freelance gig on the side, or a personal project, I take it as an opportunity to try out the "hip" stuff just to learn what's good about it and whether I like it or not (most recently Nuxt.js).

Also, the older, more stable tech still gets cool updates too, just at a much slower rate, and the new features are very often adopted from the newer languages/frameworks. So if I've played with the newer stuff, I know what an update is about and have a much easier time getting used to it.


👤 btschaegg
If I may flip your question around and ignore the technical things you seem to be after:

What certainly won't change is the way humans work. That means that the way they use tools also won't change much (modulo maybe some newly established conventions).

As a direct result, the way Don Norman simplifies the process in figure 2.7[1] of "The Design of Everyday Things" will stay relevant. If you intend to build tools for humans -- even if those tools are digital -- it will be a good idea to keep the principles of feedback and feedforward in mind. If you want to learn timeless skills, it might be a good idea to learn more about your audience (i.e. users) instead of new and shiny technologies.

[1]: https://books.google.ch/books?id=I1o4DgAAQBAJ&pg=PT76&dq=des...


👤 kjellsbells
Over long horizons, only very structural knowledge retains value. This is why CS courses drag you through exercises like writing your own compiler and simulating a CPU. This isn't an observation peculiar to computing, of course. The practice of biology/genetics is radically different from what it was even in the late 1990s... but the fundamentals are the same.

I coach all my Gen Z hires that the skill is not in coding but in communication. For example, the act of coding is a dual communication: once to the computer, where "skill" means transmitting the message simply, unambiguously, and without repetition; and once again, repeatedly, to the community of readers and maintainers who look at your code, where the skill is to communicate clearly and without misunderstanding, without needing a shared cultural, organizational, or temporal context. Being able to think deeply about what it means to communicate is a true long-term life skill.
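The dual-audience idea can be shown with two behaviorally identical functions, one written only for the interpreter and one written for future maintainers as well (the example itself is mine, not the commenter's):

```python
# Two functions the computer treats identically; only one communicates
# to the second audience: the humans who will read and maintain it.

def f(a):
    return [x for x in a if x % 2 == 0]

def even_values(values):
    """Return only the even integers from `values`, preserving order.

    The name, parameter, and docstring carry the second message:
    the one aimed at maintainers rather than the interpreter.
    """
    return [value for value in values if value % 2 == 0]

assert f([1, 2, 3, 4]) == even_values([1, 2, 3, 4]) == [2, 4]
```

Both pass the same tests; only one survives contact with a maintainer who lacks the author's context.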


👤 ctdonath
Seek to find a thread through the full stack, from sand to simulation. Get at least a cursory grasp of silicon-based transistors, binary representations of data, basic binary logic, 4-operation CPUs, simple memory & storage, a toy operating system, C programming, object-oriented C++/Swift/Java programming, memory-mapped I/O & graphics, TCP/IP networking, HTTPS protocols, then pick a major example vertical (PC, Mac, iPhone, Android) and work up to a publishable (if simple) product.

Having seen computing grow from before the IBM PC, my primary lament is new developers rarely have a view of how everything works (however simply) from the very bottom to top, from basic semiconductor electronics to getting something onto an App Store. Most pick a layer and live there, largely unaware of what's happening below or above - and so don't know why things work, and more importantly why they don't.


👤 notjustanymike
The only constant I've seen in 20 years is terrible documentation, which highlights the most common mistake all engineers make: we hyperfocus on the how, and forget to consider the why.

Learn how to explain and document why you chose an architecture for a problem. It's a habit that'll set you apart from your peers.


👤 beaconstudios
Theory and abstract principles - databases and client/server architecture, data structures and algorithms, distributed design and the two generals problem (also extrapolated as CAP theorem), race conditions and locking, queue theory and cybernetics, etc etc.
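Race conditions and locking, in particular, look the same in every decade and every language. A minimal sketch in Python: the read-modify-write on `counter` is a textbook race, made deterministic only by the lock:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(times):
    """Without the lock, two threads can read the same value of
    `counter` and lose an update; `counter += 1` is not atomic."""
    global counter
    for _ in range(times):
        with lock:  # serialize the critical section
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000, deterministic only because of the lock
```

The vocabulary changes (mutexes, monitors, channels, actors), but the underlying problem of coordinating concurrent access has been stable since the 1960s.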

👤 aronpye
We’ll always need plumbers, electricians, builders, and mechanics, at least until general AI becomes a thing. In the meantime, a lot of jobs other than those will be eroded by automation.

I don’t foresee AI becoming advanced enough to be able to plan out and build most residential homes in the next 30 or so years. Maybe after that it’ll be good enough to design basic / common homes. Same goes for the hardware and control systems needed for fine motor control for those jobs.

I also don’t see AI replacing most software engineers or STEM-based jobs until general AI becomes a thing. I can see testing probably being the first thing to go.

Fundamental physics will always be fundamental ;) and you can build up to most things from first principles.


👤 bebna
For always-useful basics I recommend the book The Clean Coder (not Clean Code; "Coder" as in worker). It is also helpful to non-programmers.

The book talks about what is important in communication in the workplace: for example, when, why, and how to say no.


👤 dpeck
The consistent, relentless, and joy sucking “professionalization” of everything involving computing.

We’ll do less with more, and then have meetings about it.

If you’re able to do things end to end and actually understand how things work people will think you’re a witch.


👤 throwawaylinux
Fundamentals about how a CPU works, machine code, caches, memory, etc. have not changed in 50 years. Well, they have and they haven't, I guess. But high-frequency, highly pipelined, speculative, out-of-order CPUs with several levels of cache and much slower memory have been pretty "unchanging" for almost 30 years. That is to say, they've also changed a lot, but fundamentally they are extremely recognizable, and many of the major basic techniques for extracting good performance are the same. Multiprocessing and SIMD/vector processing are significantly more common now, but they were around back then too.
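One of those durable techniques is exploiting cache locality by matching your traversal order to the memory layout. A sketch (the effect is most visible in languages with real contiguous 2-D arrays like C or Rust, where the row-major walk follows the cache lines; both traversals produce the same answer either way):

```python
# Row-major vs column-major traversal of the same matrix. The results are
# identical; the memory access patterns, and hence the cache behavior in
# a systems language, are very different.
n = 256
matrix = [[i * n + j for j in range(n)] for i in range(n)]

row_major = sum(matrix[i][j] for i in range(n) for j in range(n))  # walk rows
col_major = sum(matrix[i][j] for j in range(n) for i in range(n))  # walk columns

assert row_major == col_major  # same answer, different access pattern
```

Advice like "iterate in storage order" was true on a 1990s Pentium and is still true on today's hardware, which is exactly the kind of stability the comment describes.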

👤 quickthrower2
I never did this myself, but I think for maximum success you should move to the best city in the world for your career. I think this will still be true in 50 years. Will it be San Francisco? Probably, but maybe not. Either way, the advice stands.

👤 rebelos
You can't predict how things will evolve in 50 years but, as you have suggested, there are various (often simple) ideas and concepts that are timelessly useful. So develop a personal system for storing knowledge, keep adding to it, and use it to review continuously.

Here's a counterintuitive idea you won't hear often: reading on its own is futile. You'll eventually forget almost all of what you read or even learn if it's not reinforced. In fact, knowledge that isn't being actively recalled or applied can decay shockingly quickly.

The only way to combat this is to have a system. A simple and fairly informal system might be Post-Its and notes in a book that you review sporadically, whenever you happen to remember to do it. A more sophisticated system might leverage spaced repetition, practice problems, and tools like e.g. Anki and Obsidian/Roam Research.
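The scheduling idea behind such systems can be sketched in a few lines. This is a toy Leitner-box model; the card format and intervals are my assumptions for illustration, not Anki's actual data model or algorithm:

```python
def review(card, correct):
    """Leitner scheduling: a correct answer promotes the card to a less
    frequently reviewed box; a miss sends it back to box 1."""
    intervals = {1: 1, 2: 3, 3: 7, 4: 30}  # box number -> days until next review
    box = card["box"] + 1 if correct else 1
    box = min(box, max(intervals))  # cap at the last box
    return {"front": card["front"], "box": box, "due_in_days": intervals[box]}

card = {"front": "What is the derivative of x^2?", "box": 1}
card = review(card, correct=True)
print(card["due_in_days"])  # 3
```

Real tools refine the intervals per card, but the principle, expanding gaps for material you recall and tight loops for material you miss, is the whole trick.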

As an example, let's say you didn't want to forget how to differentiate (most) functions. This might sound ridiculous from where you are now, but keep in mind this is about thinking longer term. So how would you achieve this? Well you might have some flash cards with concepts and some graphics and then some more with practice problems. How many problems would you need to do a year to recall differentiation properly? Maybe 50? Add flash cards.

Now differentiation is a poor example insofar as you might never find practical utility for it across the entirety of a software engineering career and you don't necessarily even need to remember how to do it manually (e.g. just use Wolfram Alpha). It can suffice to spend 5-10 minutes a year making sure you recall the most critical ideas.

Broaden out and think about what could be useful for you. Mental models, data structures/algorithms, systems design, etc. Consider what would be needed to retain that knowledge and then extend your system to make it happen. Focus on ideas that are disproportionately useful (or could be) but that you encounter infrequently (so e.g. perhaps you might include 'distributed systems consensus' and not 'git fundamentals'). For most concepts it suffices to have a good high level understanding and assume that you can deep dive and relearn the rest, should the need arise.

You'll be amazed at the surface area you can cover spending even 5 hours a week doing this. It's one of the easiest ways to remain an effective and flexible thinker.


👤 silisili
Most languages, especially those you listed, are rather similar. You learn how to program in one (really program, not just make something compile via hail marys), and switching to another isn't that difficult. New libraries, different names, sometimes a new paradigm or two, but nothing really groundbreaking.

Unless functional programming takes over the world, then a lot of us are screwed.


👤 julianlam
> On the flip side, I wonder if mastering the fundamentals and the things that will not change in the next 50 years is a wiser approach.

This will never go out of style. All these new frameworks are built on JavaScript or similar underlying technologies, and knowing the underlying technology will always pay dividends.

Learning the framework du jour will only make you proficient in that framework.


👤 tomcam
Listening, speaking, and writing well.


👤 randtrain34
Data structures, algorithms, and fundamentals like how compilers, memory, and operating systems work likely won't change; programming languages and frameworks are constantly changing.

👤 dmitrygr
C: No matter what the newest hippest toy will be, it will be running on a large layer of C code.

COBOL: No matter what where how, the airlines and banks will still be running on COBOL.


👤 J_cst
Funny enough, just yesterday I was thinking that cloth hangers will stay the same forever... Don't know why

👤 jl2718
In 50 years, if you do really well, everything will be different except the life partner you are choosing now.

👤 joeman1000
The core of emacs. It will have new features, but emacs skill will never become obsolete.

👤 hbcondo714
Fifty years from now you'll be very dead. Your entire generation will fuck this planet into a coma[1]

[1] https://m.imdb.com/title/tt5463162/quotes/


👤 hprotagonist
emacs, ffmpeg, imagemagick, ...

👤 sys_64738
The GIL.

👤 launchiterate
Communication skills.

👤 i0n1
the fact that most of us will probably be dead :)

👤 alphabet9000
1366x768

👤 zheng_qm
The way people chew gum

👤 heybecker
Nerds will still be awkward