Generating a snippet, or even entire blocks, of code is not a good metric on which to base the quality of LLMs.
Rather, we should pay attention to how good an LLM is at altering, editing, and maintaining code. In practice, it's quite poor. And not poor in the way that a new developer is poor, but in weird, uncertain, and unpredictable ways, or in ways that it fixed in the last prompt but broke in the new prompt.
Engineers spend little time writing code compared to the amount of time they spend thinking, designing, and building systems for that code to operate in. LLMs are terrible at systems.
"Systems" implies layers, and layers implies constraints, and constraints implies specificity, and specificity is not a strong suit of an LLM.
The systems in which most code is written are not written once and never touched again. Instead, code passes through many layers (automated and manual) to reach some arbitrary level of correctness across many vectors, in many contexts, some digital and some not. LLMs are passable at solving problems of one, maybe two, layers of complexity, as long as all the context is digitized first.
Lastly, not all problems have a technical solution. In fact, creating a technical solution for a non-technical problem just so we can apply an LLM to it will only add layers of unnecessary abstraction that lock "normal" people out of participating in the solution.
And we may not like to admit it, but we engineers often have interesting work to do, at jobs we like, for pay that's pretty good, only because there are a whole lot of non-programmers out there solving the non-technical problems that we don't want to do.
So we don't care if LLMs replace coders by auto-generating code. All the better for us. The people who are trying to support families by offering to write CRUD apps or maintain codebases for the world are doomed... they are going to end up as the homeless guy on the street holding a sign saying "will code HTML for food".
On the other hand, as someone whose role gives me visibility into the way senior leaders at the company think about what AI will be able to do, I’m absolutely terrified that they’re going to detonate the company by massively over-investing in AI before it’s proven and by forcing everyone to distort their roadmaps around some truly unhinged claims about what AI is going to do in the future.
CEOs and senior corporate leaders don’t understand what this technology is and have always dreamed of a world where they didn’t need engineers (or anyone else who actually knows how to make stuff) but instead could turn the whole company into a big “Done” button that just pops out real versions of their buzzword-filled fever dreams. This makes them the worst possible rubes for some of the AI over-promising — and eager to make up their own!
Between this and the really crazy over-valuations we’re already seeing in companies like Nvidia, I’m seeing the risk of a truly catastrophic 2000- or 2008-style economic crash rising rapidly and am starting to prepare myself for that scenario in the next 2-5 years.
You can practically estimate each commenter's proficiency and experience level from the comments here.
People that have seen some serious sh*t probably won't even bother answering the question.
As long as an understanding of the domain and the problem is required, a skilled person is required between the codebase and the LLM.
Even if we get to a point in the future where a programmer can be replaced by an LLM, businesses will still want someone to use the LLMs to create software.
I started my career over a decade ago. I knew PHP and MySQL only. Then I moved on to other companies and learnt other stacks (frontend, node, postgres, cloud, go, python, datadog, ci/cd, docker, k8s, aws, etc.). Nowadays it's not hard for me to find a new job. But what would have happened if I had stayed with PHP and MySQL and never learnt anything else? I would be jobless.
So, I'm not afraid of LLMs replacing me. I think they will open the door to more jobs in IT, actually.
I just need to keep learning (and AI has nothing to do with this).
It took a lot of work to get it working. That's not to say it couldn't have been coerced into being right through prompts, but it's a reminder that LLMs are not thinking. You still need someone who understands not only what is wanted, but why it looks the way it does.
I suspect that most of this stuff will end up like syntactic sugar: something that makes our work easier, but doesn't fundamentally replace us.
In the end, we'll need humans as a final sanity check, or for overall design, for a good while yet. Or just to decide on the right requirements.
I think with enough scale and more organized and structured use of them, they (or another soon-to-come AI breakthrough) will outdo a human mind.
GPT-5 will be significantly better at coding, to the point where it might no longer make any sense to hire junior developers.
And this is just GPT-5, this year. Next year there will be GPT-6, or an equivalent from Google or Anthropic, and at that point I fully expect a lot of people everywhere getting the boot. Sometime next year I expect these powerful models will start effectively controlling robots, and that will start the process of automation of a lot of physical work.
So, to summarize, you have at best 2 years left as a software engineer. After that we can hope there will be some new types of professions that we could pivot to, but I’m struggling to think what could people possibly do better than GPT-6, so I’m not optimistic. I’d love for someone to provide a convincing argument why there would be any delay to the timeline I outlined above.
p.s. I just looked at the other 20 responses in this thread, and it seems that every single one is based on current (GPT-4) LLM capabilities. Do people seriously not see any progress happening in the near future? Why? I’m utterly baffled by this.
https://www.bls.gov/ooh/computer-and-information-technology/...
This is a pretty drastic change from their earlier prediction (often cited all over the internet) that the number of software engineers will increase by more than 30% in the next 10 years. It's not the full replacement of software engineers I'm worried about, so much as the steep reduction in the number of jobs and the labor/wage pressure that will make this job pay a fraction of what it's paying now, and make everyone's livelihoods more precarious in the next 10 to 15 years.
Karpathy already stated in 2017 that "Gradient Descent writes better code than you", when he wrote about "Software 2.0" as feeding data to neural networks: https://karpathy.medium.com/software-2-0-a64152b37c35 Nvidia's CEO, Jensen Huang, seemed to confirm that point this week when he urged parents not to encourage their kids to learn to code.
Today, this YT video by a dev named Will Iverson about how software engineering jobs are not coming back made me really anxious, and I've started to worry about making backup career plans in case I need to transition in my late thirties / early forties. (That sounds sooo hard... I'm a recently laid off mid-level full stack engineer of seven years, but I wonder if it would be better to transition now while I'm younger. Why wait 10 to 15 years to become increasingly obsolete, or more stressed about being laid off? How can I support a family like that? Or make any plans for the future that might impact other people I'm responsible for?) https://www.youtube.com/watch?v=6JX5ZO19hiE&t=3s
I don't think the industry will ever really be the same again. But I'm sure a lot of us will adapt. Some of us won't, and will probably have to switch careers. I always thought I could at least make it to retirement in this profession, by continually learning a few new skills each year as new tech frameworks emerge but the fundamentals stay the same -- now I'm not so sure.
If you think I'm wrong, can you please help me not be anxious? Older devs, how have you managed to ride out all the changes in the industry over the last few decades? Does this wave of AI innovations feel different than earlier boom-bust cycles like the DotCom Bubble, or more of the same?
What advice would you give to junior or mid-level software engineers, or college grads trying to break into the industry right now, who have been failing completely at getting a foot in the door in the last 12 months, when they would have been considered good hires just two or three years before?