1. LLMs and their obvious descendants are nearly maxed out in effectiveness; they will fade, and another tech hype cycle will take their place
2. While most junior-level dev work is easily automatable, mid-to-senior-level dev work cannot be replaced, and in five years' time will be no closer to being replaced
3. LLMs and AI coding tools will actually make code worse and less efficient, cancelling out any presumed efficiency gains from their use.
The overall sentiment I seem to be picking up from some senior devs regarding recent changes in the job market is: "Haha, I feel bad for all you junior devs who can't get a job now, but AI will never replace ME."
For anyone who agrees with any of the three sentiments above: what justification backs up your belief? Do you feel the general cynicism/dismissiveness is justified?
If it's on the web it can do it, but if it's new or complex it doesn't seem to work. Mostly, I see it as an interface to reporting and maybe calling some APIs.
The data-test conversions I've tried have all failed horribly.
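To illustrate the "interface to reporting / calling some APIs" use above, here's a rough sketch using OpenAI's tool-calling format; `run_sales_report` and its parameters are made up for the example:

```python
# Sketch: the LLM just picks a function and fills in arguments;
# your own code still does the actual work. `run_sales_report`
# is a hypothetical internal reporting function.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

def run_sales_report(region: str, quarter: str) -> str:
    return f"(sales report for {region}, {quarter})"  # stand-in for real reporting code

tools = [{
    "type": "function",
    "function": {
        "name": "run_sales_report",
        "description": "Generate a sales report for a region and quarter.",
        "parameters": {
            "type": "object",
            "properties": {
                "region": {"type": "string"},
                "quarter": {"type": "string"},
            },
            "required": ["region", "quarter"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Show me Q3 sales for EMEA"}],
    tools=tools,
)

# If the model chose to call the tool, dispatch it ourselves.
call = response.choices[0].message.tool_calls[0]
print(run_sales_report(**json.loads(call.function.arguments)))
```

The model never executes anything itself; it only proposes the call, which is why this works for reporting but falls apart on problems it hasn't seen.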
>While most junior-level dev work is easily automatable, mid-to-senior-level dev work cannot be replaced, and in five years' time will be no closer to being replaced
Because companies were not set up with the idea of being able to contextualize and feed all their internal information into an AI. Maybe going forward this will change... _maybe_ in 10-20 years (rather arbitrary, I guess) one or two "managers" could use AI to interface with customers, but for the foreseeable future I just don't see how it could work.
Assume you had an LLM that never hallucinated or gave a "wrong" answer. Someone still has to hook it up and constantly feed it context, or it's rather useless.
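To put it concretely: even with a perfect model, someone has to build and maintain plumbing like the sketch below. The OpenAI client is just an example here, and `search_internal_docs` is a made-up placeholder for a retrieval layer the company would have to build itself:

```python
# Even a never-wrong LLM only answers as well as the context you
# feed it. The retrieval function below is hypothetical: it stands
# in for indexing wikis, tickets, schemas, etc., and keeping that
# index fresh -- the part nobody ships for you.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

def search_internal_docs(question: str) -> list[str]:
    # Hypothetical stand-in; real retrieval over internal data is
    # the hard, ongoing work this comment is talking about.
    return ["(relevant excerpt from an internal wiki page)"]

def answer_with_context(question: str) -> str:
    snippets = "\n---\n".join(search_internal_docs(question))
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": f"Answer using ONLY this context:\n{snippets}\n\nQuestion: {question}",
        }],
    )
    return response.choices[0].message.content
```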
AI will likely revolutionize some software development tasks, but it's too hard to predict the long-term effects when we don't even know the future demand for software.
AI can take over the business of shunting flat, boring data around in APIs, but everything else isn't such low-hanging fruit. It can also have a crack at complex data analysis in really constrained problems, say feature detection. I can't see dealing with the infinitely variable whims of UI designers, or approaching genuinely new problems, falling within its purview, however.
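By "flat, boring data" I mean the kind of field-for-field glue code below (all names invented for the example): an LLM can churn this out from a couple of samples, but deciding what the target schema should be is still on you.

```python
# The flat, boring data-shunting an LLM handles well: translate one
# record shape into another, field for field. All names are made up.
from dataclasses import dataclass

@dataclass
class UpstreamOrder:  # shape received from a partner API
    order_id: str
    total_cents: int
    customer_email: str

def to_internal_record(order: UpstreamOrder) -> dict:
    # Pure mechanical mapping: no judgment required, so it's easy
    # for an LLM to generate. Choosing the internal schema isn't.
    return {
        "id": order.order_id,
        "total": order.total_cents / 100,  # cents -> dollars
        "contact": order.customer_email.lower(),
    }

print(to_internal_record(UpstreamOrder("A-17", 1999, "Jane@Example.com")))
```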
I'm also pretty convinced that AI won't replace software dev any earlier than it will replace the majority of corporate jobs, and at that point everyone has a problem.
There's also a valid point that you shouldn't copy/paste code from a forum or other source without fully understanding it and ensuring its correctness; blindly relying on AI potentially violates that practice. And who do you credit as the source of the code? The AI got it from somewhere else and might not tell you the true origin.
I have heard some rumblings about copyright violation and plagiarism. Clearly there's a loss of income for content creators who rely on ad revenue: whoever reads the ChatGPT answer is unlikely to follow the links back to the original author. So, in addition to the skepticism of senior devs who have actually tried the AI, there's potential for its use to be curtailed through legal means.
We don't know what the future holds and should question everything.