It helps you code somewhat more efficiently, like many of the other tools we have, but doesn’t help with the hard parts of software development.
Part of engineering software is reducing boilerplate. You can only rely on LLMs for easily-verifiable or well-understood code. Most easily-verifiable code is boilerplate and most well-understood code is already in a reusable library.
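To make "easily-verifiable" concrete: think of serialization boilerplate, where the spec fits in a sentence and correctness is obvious at a glance. A minimal, hypothetical Python example of the sort of thing one might delegate to an LLM and check in seconds:

```python
from dataclasses import dataclass

# Boilerplate of this shape is easy to delegate: the spec fits in one
# sentence, and a reviewer can verify the result at a glance.
@dataclass
class User:
    id: int
    name: str
    email: str

    def to_dict(self) -> dict:
        # Trivial field-by-field mapping; nothing here requires judgment.
        return {"id": self.id, "name": self.name, "email": self.email}

    @classmethod
    def from_dict(cls, d: dict) -> "User":
        return cls(id=d["id"], name=d["name"], email=d["email"])
```

The hard part, deciding what `User` should contain and how it flows through the system, is exactly what this kind of delegation doesn't cover.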
With that in mind, I think the following is plausible: how we write code will change, efficiency will improve, product skills will become more important, and people at the far right of the skill spectrum will see their salaries increase even more.
SWEs already have insanely high ROI, and the demand for more software is nearly boundless. Even at many large companies, products move slower than we want because even top-of-market engineers can't work that fast, and there are limits to how parallelizable the work is. If you can double the code output of individual engineers, you get the effect of doubling the size of the org while keeping costs flat.
For some reason, though, even more jobs were created.
I spent a week of my first internship (back before there were smartphones) creating a marketing page. With LeadPages, I later made one in an afternoon that works 10x better, looks 10x nicer, and works on mobile.
I think if you're good with people and can get solutions across the finish line you'll always have work.
- More software will be consumed (Jevons paradox)
- Therefore more demand for skilled engineers
- And the LLMs will not be able to build things end to end
- Salaries at the top end will rise, since an AI augmented senior engineer can deliver more value
Far future: no more software engineers, but also no more human work of any kind. A post-scarcity society.
I think AI will be the exception to this. There are going to be a lot of SWE-like support roles around AI: MLOps, glue code, etc. AI people really do not enjoy this type of work. And comp for AI work is even crazier than top FAANG jobs used to be (top salaries being reported in the $5M–$20M/yr range).
As we move out of the current macro environment and the job market in general improves (this one may take some years), I think we'll still see a lot of LLM-enabled developer jobs.
Being a mathematician used to mean doing a lot of calculating. We have calculators to do that for us now but there are still mathematicians.
Being a developer used to mean doing a lot of programming. We have programmers now (GPT-4) but we'll still have developers driving them (in my opinion).
There were similar panics when things like dynamically typed languages came along, or the idea of programming on a computer rather than punch cards. If you've been doing it for a while, it seems like with these developments "anyone" will be able to do it. Which, maybe that's true. Maybe "anyone" could do it now, with some study and effort. But as far as I can tell, most people still do not want to do software development even if it's LLM-enabled. In fact, even some people in tech don't want to use LLMs.
I think what we have now are just way better tools. Power tools if you will. We're still carpenters. We just don't have to turn the screwdriver anymore.
That will open many opportunities for people to do things they always wanted to do, but did not have the time to do before.
This is a golden age in some ways.
Big software companies will have to decide between taking more projects or cutting staff. I think many people expect them to cut staff. I expect them to take on more projects.
Entirely new SWE management / project management methodologies will develop as currently less-qualified SWEs learn quickly with AIs and become much more productive.
I remember legends about companies encouraging engineers to spend a few days a week on their own projects. If I were running a SWE company, I would (1) encourage everyone to share efficiency hacks that involve AI, and (2) encourage everyone to spend a few hours a week on their own projects. In the medium term, I would not lay off people; I would assign them to new projects.
But I'm crazy. I thought the Internet was a fad and that we'd all be back to BBSes in 6 months.
However, this will only raise the bar: now that software is easier to make, more people will want higher-quality, customized solutions, increasing demand even further.
In the end consumers will have better quality software for the same price.
I don't know yet what that looks like. From my interactions with ChatGPT, it's very good at recalling, but not so good at thinking, if that makes sense.
So yeah, if your job mostly consists of what can be copy-pasted off of Stack Overflow, I'd start to sweat a bit.
2. Even once we start using LLMs and such for some aspect of safety-critical software development, we're unlikely to use them for everything. E.g., if we let an LLM write code, we probably won't use an LLM to also review and test that code. So humans are still needed there, and they have to be competent enough to usefully review the code (see the sketch after this list).
3. In my own role as a principal engineer, I find myself more interested in the bigger picture of software design and customer satisfaction. I write less code already, delegating to others, and I'm okay with that. The idea of having a tool that could write code for me, even if not 100% of it, leaving me more time to plan new features and products, seems appealing. (Even if I personally can't actually use the tool myself yet, due to (1), the idea in concept sounds good, and can be leveraged in other industries sooner.)
4. I think it really remains to be seen to what extent LLMs will be able to completely take over the software development process. From my own world, if I were to ask an LLM to "write flight management software suitable for a Cessna Citation X", well, I don't expect usable results at this point. I would anticipate having to break the problem down into sufficiently small, well-understood chunks that we probably wouldn't really be eliminating that many humans from the process. There's a big difference, I think, between writing a 1,000-line program that is heavily influenced by numerous examples of well-known, well-documented code, and writing a 1,000,000-line program that does things that are more obscure.
5. I hear lots of software developers talk about how awesome LLMs are compared to getting answers from StackOverflow. It sounds to me like some of these folks spend a lot of time snarfing StackOverflow to do their job. I personally have barely ever found the answers to my work problems on StackOverflow. My own first-hand experiences with LLMs so far suggest that they could help me reduce some boring boilerplate code and decipher some poorly-written API documentation, but I just don't see them helping with most of what I work on so far. I suppose how much LLMs can replace one's job may depend on to what extent one's job actually is copy-pasting from StackOverflow.
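On point 2 above (the promised sketch): one plausible separation of duties is that the LLM drafts the implementation while a human writes the acceptance tests independently, from the requirements rather than from the generated code. Everything below is hypothetical, the function and limits included; it's only meant to illustrate the split:

```python
import unittest

# Hypothetically LLM-drafted: clamp a control-surface command to its
# certified deflection limits (numbers invented for illustration).
def clamp_deflection(command_deg: float, min_deg: float = -20.0,
                     max_deg: float = 15.0) -> float:
    return max(min_deg, min(max_deg, command_deg))

# Human-written tests, derived from the requirements document rather
# than from the generated code above.
class ClampDeflectionTest(unittest.TestCase):
    def test_within_limits_passes_through(self):
        self.assertEqual(clamp_deflection(5.0), 5.0)

    def test_upper_limit_enforced(self):
        self.assertEqual(clamp_deflection(40.0), 15.0)

    def test_lower_limit_enforced(self):
        self.assertEqual(clamp_deflection(-40.0), -20.0)

if __name__ == "__main__":
    unittest.main()
```

The point isn't the code; it's that the verifier's competence has to be independent of the generator's.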
But really, the most important part of software engineering is the iteration and feedback loops at different levels. It's not that design or other considerations aren't important, but without the closed loops you don't know whether what you are doing is effective. I think design and code review also go in the category of feedback loops.
SWE will be about choosing the virtual team of engineers and testers, connecting them, feeding them the right instructions, and connecting that virtual team in closed loops with real-world users.
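That's speculative, but the shape of such a loop is easy to sketch. In the snippet below, `ask` is a stand-in for whatever model API you'd actually wire up (it is not a real library call), and the "virtual team" is just two roles in a write-review-feedback cycle:

```python
# Hypothetical orchestration loop: an "engineer" role drafts code, a
# "tester" role critiques it, and real user feedback closes the loop.
def ask(role: str, prompt: str) -> str:
    """Stand-in for a real model API; replace with an actual client."""
    raise NotImplementedError("wire up an LLM client here")

def build_feature(spec: str, get_user_feedback, max_rounds: int = 5) -> str:
    draft = ask("engineer", f"Implement this spec:\n{spec}")
    for _ in range(max_rounds):
        review = ask("tester", f"Find defects in:\n{draft}")
        feedback = get_user_feedback(draft)  # the real-world closed loop
        if "no defects" in review.lower() and feedback == "ship it":
            return draft
        draft = ask("engineer",
                    f"Revise this code.\nReview: {review}\nUser feedback: {feedback}")
    return draft
```

In that picture, the human's job is the spec, the feedback channel, and deciding when "ship it" is actually true.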
It seems very unlikely that humans will be able to keep up with the AI in software design, architecture, implementation, etc. after the next year or two. It's possible that progress will stop, but there is no reason to believe that.
I find it very easy to solve problems, but tedious to broadly apply solutions across domains. I'm also very sloppy as a programmer, letting my mind wander further into future problems to the detriment of the current task. Having an LLM buddy to codify my thoughts, regularize them, and apply them with high precision would make me MUCH more productive.
In the end, it may be that LLMs are simply better programmers than 99.999% of people. But there will always be need for specialists to bridge between the LLM and some other domain and programmers of today will be that bridge.
And if not... then AGI will have eaten us all up to make paper clips anyway.
Your fundamental assumption is that there's a finite amount of software work. Had this been the case, we'd have seen signs of it plateauing over the last 50 years. Until we have true AGI, LLMs are just another tool to increase what humans are capable of. Historically, this means humans just ask for more stuff.
I feel like this is the WordPress/Blogger.com moment for coding. There should be an explosion in the amount of apps and products.
As someone else said, web WYSIWYG editors took a lot of jobs. But then they opened up so many more jobs as more people needed databases and dynamic tools.
It seems like we have just kicked the can further down the road, into an even larger hole of needs.
My prediction: computers that learn with much, much smaller models, causing more models to be created for an ever-increasing set of use cases. We have opened up a Pandora's box.
How much did the internet itself change coding? Not a lot at first, then a whole bunch.
I think since the industrial revolution, the most valuable jobs have been those that require the most distinctly human traits. With AI doing things we previously thought only humans could do, that will shift.
Soon we will have interviews where you are expected to use ChatGPT to solve the interview question.
Optimistically, no one would need to work as an SWE, since they could bootstrap companies at the speed of light.
Everything else remains the same.
Don't try to over-prepare for a future that is not coming anytime soon.