The picture of software development also looks completely different. Code that used to be readable in a few lines becomes 100 lines—overblown because, well, code is cheap. Now, I could argue that it makes things unreadable and so on, but honestly, who cares? Right? The AI can fix it if it breaks...
So what do you guys think? Is this the future? Maybe the skill to focus on is orchestrating AI, and if you don’t do that, you become a legacy developer—someone with COBOL-like skills—still needed, but from the past millennium.
Juniors don't have that skillset yet, but they're being pushed to use AI because their peers are using it. Where do you draw the line?
What will happen when the current senior developers start retiring? What will happen when a new technology shows up that LLMs have no human-written code to train on? Will pure LLM reasoning and generated agent skills be enough to bridge the gap?
These are all very interesting questions about the future of the development process.
AI systems look at code on the internet that was written by humans. This is smart, clean code. And they learn from it. What they produce — unreadable spaghetti code — is the maximum they can squeeze out of the best code written by humans.
In the near future, AI-generated code will flood the internet, and AI will start training on its own code. On the other hand, juniors will forget how to write good code.
And when these two factors come together in the near future, I honestly don’t know what will happen to the industry.
Yes. The future is quickly produced slop. Future LLMs will train on it too, getting even sloppier. And "fresh out of uni" juniors and "outsourced my work to AI" seniors won't know any better.
Is this because the guys claiming success are working in popular, well-known, more limited areas like JavaScript on web pages, while the people outside those areas, with more complex systems, don't get the same results?
I also note that most of the "don't code any more" guys have AI tools of their own to promote...