HACKER Q&A
📣 kaesar14

What are your optimistic takes on the SWE job market post AI?


I think we’ve all heard quite a bit about how LLMs will impact SWE work in the near future, and would like to hear the other side. Please keep it realistic: no saying the job won’t change dramatically or otherwise underselling the impact of LLMs.


  👤 wilsonnb3 Accepted Answer ✓
You aren’t going to be able to tell which companies have developers using LLMs and which don’t, in the same way that you can’t tell who is using IntelliSense and who is using vim.

It helps you code somewhat more efficiently, like many of the other tools we have, but doesn’t help with the hard parts of software development.


👤 strken
It's not about writing code, it's about engineering software. CAD changed the job of civil and structural engineers dramatically, but maybe not quite as dramatically as the hype suggested it would.

Part of engineering software is reducing boilerplate. You can only rely on LLMs for easily-verifiable or well-understood code. Most easily-verifiable code is boilerplate and most well-understood code is already in a reusable library.


👤 Hermitian909
I was bearish on this till I saw the jump from ChatGPT to GPT-4. The latter can write readable, correct code for 500-1000 line programs. However, barring a revolution, AI models are "memory constrained" and will struggle to comprehend million-line codebases (which will become more common if AI is writing more code), and they are very bad at planning, i.e., making changes with an eye toward the future.

With that in mind I think the following is plausible: how we write code will change, efficiency will improve, product skills will become more important, people at the far right of the skill spectrum will see their salaries increase even more.

SWEs already have insanely high ROIs and the demand for more software is nearly boundless. Even at many large companies, products move slower than we want because even top-of-market engineers can't work that fast, and there are limits to how parallelizable the work is. If you can double the code output of individual engineers, that's better than doubling the size of the org, and costs stay stable.


👤 jameshush
Thousands of engineering jobs have already been lost post 90's... to Squarespace/Wix/Wordpress etc.

For some reason though, even more were created.

I spent a week of my first internship creating a marketing page. With LeadPages, I made one in an afternoon that works 10x better, looks 10x nicer, and works on mobile (my internship was back before there were smartphones).

I think if you're good with people and can get solutions across the finish line you'll always have work.


👤 pharmakom
- Software costs will fall a bit

- More software will be consumed (Jevons paradox)

- Therefore more demand for skilled engineers

- And the LLMs will not be able to build things end to end

- Salaries at the top end will rise, since an AI augmented senior engineer can deliver more value

Far future: no more software engineers, but also no more human work of any kind. Post scarcity society.


👤 f0e4c2f7
I think we'll have a minor downturn related to macro issues (already seeing this) where getting a job will become similar to 2000 (arguably it already has). More normalized salaries (100k-150k) and something closer to applying for a job in the rest of the economy (you might have to interview a lot or move cities).

I think AI will be the exception to this. There are going to be a lot of SWE-like support roles around AI: MLOps, glue code, etc. AI people really do not enjoy this type of work. And comp for AI work is even crazier than top FAANG jobs used to be (top salaries are being reported in the $5M-$20M/yr range).

As we move out of the current macro and the job market in general improves (may take some years for this one) I think we'll still see a lot of LLM enabled developer jobs.

Being a mathematician used to mean doing a lot of calculating. We have calculators to do that for us now but there are still mathematicians.

Being a developer used to mean doing a lot of programming. We have programmers now (GPT-4) but we'll still have developers driving them (in my opinion).

There were similar panics when things like dynamically typed languages came along, or the idea of programming on a computer rather than punch cards. If you've been doing it for a while, it seems like with these developments "anyone" will be able to do it. Which, maybe that's true. Maybe "anyone" could do it now, with some study and effort. But so far as I can tell, most people still do not want to do software development even if it's LLM enabled. In fact, even some people in tech don't want to use LLMs.

I think what we have now are just way better tools. Power tools if you will. We're still carpenters. We just don't have to turn the screwdriver anymore.


👤 RecycledEle
SWEs (SoftWare Engineers) will be much more productive, and new people will be able to come into the field post-GPT-4.

That will open many opportunities for people to do things they always wanted to do, but did not have the time to do before.

This is a golden age in some ways.

Big software companies will have to decide between taking more projects or cutting staff. I think many people expect them to cut staff. I expect them to take on more projects.

Entirely new SWE Management / Project Management methodologies will develop as currently less-qualified SWEs learn quickly and become much more productive using AIs.

I remember legends about companies encouraging engineers to spend a few days a week on their own projects. If I were running a SWE company, I would (1) encourage everyone to share efficiency hacks that involve AI, and (2) encourage everyone to spend a few hours a week on their own projects. In the medium term, I would not lay off people; I would assign them to new projects.

But I'm crazy. I thought the Internet was a fad and that we'd all be back to BBSes in 6 months.


👤 maxilevi
My take is that with AI productivity will increase significantly and reduce the price of developing software.

However, this will only raise the bar: now that software is easier to make, more people will want higher-quality customized solutions, increasing demand even further.

In the end consumers will have better quality software for the same price.


👤 silisili
I'm split. To me, writing code is the easiest part of the job. Modeling data structures, architecting, thinking big picture about business needs now and in the future, etc are the bread and butter. ChatGPT can write code fine. And probably even model small things fine. But it's blind to 'the bigger picture', and will be until it's less constrained.

I don't know yet what that looks like. From my interactions with ChatGPT, it's very good at recalling, but not so good at thinking, if that makes sense.

So yeah, if your job mostly consists of what can be copy and pasted off of stack overflow, I'd start to sweat a bit.


👤 tjr
1. Some safety-critical industries (like mine, aerospace) have a mixture of bureaucracy, process, and protocols that will make it much longer before a nondeterministic neural network system takes over the human engineer jobs. I'm not going to prattle on about the details here, but suffice it to say there are substantial regulations about using automated tools to generate and/or test avionics software, and adoption of new technologies is a slow, meticulous matter.

2. Even once we start using LLMs and such for some aspect of safety-critical software development, we're unlikely to use it for everything. E.g., if we let an LLM write code, we probably won't use an LLM to also review and test the code. So humans are still needed there, and they have to be competent enough to usefully review the code.

3. In my own role as a principal engineer, I find myself more interested in the bigger picture of software design and customer satisfaction. I write less code already, delegating to others, and I'm okay with that. The idea of having a tool that could write code for me, even if not 100% of it, leaving me more time to plan new features and products, seems appealing. (Even if I personally can't actually use the tool myself yet, due to (1), the idea in concept sounds good, and can be leveraged in other industries sooner.)

4. I think it really remains to be seen to what extent LLMs will be able to completely take over the software development process. From my own world, if I were to ask an LLM to "write flight management software suitable for a Cessna Citation X", well, I don't expect usable results at this point. I would anticipate that I would have to break the problem down into sufficiently small, well-understood chunks that we probably wouldn't really be eliminating that many humans from the process. There's a big difference, I think, between writing a 1000-line program that is heavily influenced by numerous examples of well-known, well-documented code, and writing a 1,000,000-line program that does things that are more obscure.

5. I hear lots of software developers talk about how awesome LLMs are at getting answers compared to StackOverflow. It sounds to me like some of these folks spend a lot of time snarfing StackOverflow to do their job. I personally have barely ever found the answers to my work problems on StackOverflow. My own first-hand experiences with LLMs so far suggest that they could help me reduce some boring boilerplate code, and help me decipher some poorly-written API documentation, but for most of what I work on, I just don't see them helping me so far. I suppose that how much LLMs can replace one's job may depend on to what extent one's job actually is copy-pasting from StackOverflow.


👤 ilaksh
There is no way it is going to be the same job. In my opinion a lot of people are confused about what software engineering actually is and think it's about using "best practices" or the right frameworks or a certain level of code coverage.

But really the most important part of software engineering is the iteration and feedback loops at different levels. It's not that design or other considerations aren't important, but without closed loops you don't know whether what you are doing is effective. I think that design and code review also go in the category of feedback loops.

SWE will be about choosing the virtual team of engineers and testers, connecting them and feeding them the right instructions, and connecting that virtual team in closed loops with real world users.

It seems very unlikely that humans will be able to keep up with the AI in software design, architecture, implementation, etc. after the next year or two. It's possible that progress will stop, but there is no reason to believe that.


👤 anonuser123456
I am exceptionally optimistic about a future with LLMs. They seem to do really well at fastidiously replicating solutions to problems. What they currently lack is the relevant training data to generalize solutions to problems, or the ability to perform higher-order generalization.

I find it very easy to solve problems, but tedious to broadly apply solutions across domains. I'm also very sloppy as a programmer, letting my mind wander into future problems to the detriment of the current task. Having an LLM buddy to codify my thoughts, regularize them, and apply them with high precision would make me MUCH more productive.

In the end, it may be that LLMs are simply better programmers than 99.999% of people. But there will always be need for specialists to bridge between the LLM and some other domain and programmers of today will be that bridge.

And if not... then AGI will have eaten us all up to make paper clips anyway.


👤 serjester
Imagine if every business had access to teams of Google’s best, perfectly aligned with their goals. This opens up so many possibilities. For the vast majority of businesses, software is a cost center, not a core competency. There are so many projects that are simply cost-prohibitive right now.

Your fundamental assumption is that there’s a finite amount of software work. Had this been the case, we’d have seen signs of it plateauing over the last 50 years. Until we have true AGI, LLMs are just another tool to increase what humans are capable of. Historically this means humans just ask for more stuff.


👤 spaceman_2020
ChatGPT-like tools are essentially lowering the barrier to entry. That’s great for casual programmers and beginners who want to play in the big leagues. I’m not sure it’s going to replace experienced developers who truly understand algorithms, architecture, and systems design, but it’s going to help a lot of people with <1 year of coding experience build better products.

I feel like this is the WordPress/Blogger.com moment for coding. There should be an explosion in the amount of apps and products.


👤 nashashmi
There will be more creative uses of LLM models that software engineers will have to deliver.

As someone else said, web WYSIWYG editors took a lot of jobs. But then they opened up so many more jobs as more people needed databases and dynamic tools.

It seems like we have just kicked the can further down the road into a larger hole of needs.

My prediction: computers that learn on much, much smaller models, causing more models to be created for an ever-increasing set of use cases. We have opened Pandora’s box.


👤 mostertoaster
As I’ve heard it said, we will probably overestimate what will happen in the next five years and underestimate what will happen in the next ten.

How much did the internet itself change coding? Not a lot at first, then a whole bunch.

I think since the industrial revolution the most valuable jobs are those that require the most distinctly human traits. With AI doing things we typically only thought humans could do, that will shift.

Soon we will have interviews where you are expected to use ChatGPT to solve the interview question.


👤 mstaoru
I don't know, but it would be interesting to see someone recreate the whole underlying backend complexity from the ground up using AI only, and run a simple Twitter clone on it. I mean server firmware, BIOS, bootloader, OS, kernel, networking stack, web server, DB server, message bus, application server - all AI-made.

👤 satvikpendem
I probably wrote on the order of 50 lines of actual code in the last 2 weeks. Do you know why? Most of that time was actually spent discussing what code should be changed, gathering requirements, figuring out how to change the code, and then finally actually changing it.

👤 throwmeaway2232
If LLMs improve your productivity, you'll be more likely to start your own startup. Why let your employer reap the benefits of your improved productivity?

Optimistically, no one would need to work as an SWE, since they could bootstrap companies at the speed of light.


👤 trashface
I'm not an optimist but I think younger programmers will still continue to find employment and they will be able to use AI to write code in new ways. Employers have always been interested in hiring young people and that won't change.

👤 pdimitar
My optimistic take is that you shouldn't care because the only thing LLMs will do is take away the job of generating small and easily verifiable boilerplate.

Everything else remains the same.

Don't try to over-prepare for a future that is not coming anytime soon.


👤 haakonhr
My optimistic take is that more time will be spent thinking about specification and validation. I also hope that code synthesis and verification tools become more common.

👤 flappyeagle
The most optimistic take is that it's like any other breakthrough technology, and creates many new (SWE) jobs because of the vast many new applications that it enables.

👤 kirti23
nice