HACKER Q&A
📣 atleastoptimal

Do you think some devs are in denial regarding coming AI changes?


Reading a recent thread on people's expectations of how the job market will change due to AI in the next 5 years, it seems many people are reluctant to admit the scope of the change on the horizon. There is a fairly pervasive sentiment that AI will just be "more of the same" and that the same old rules will still apply years down the line. Specifically, I've repeatedly noticed the following sentiments, which I disagree with:

1. LLMs and their obvious successors are nearly maximized in their effectiveness, and will fade as another tech hype cycle takes their place

2. While most junior-level dev work is easily automatable, mid- to senior-level dev work cannot be replaced, and in 5 years' time will be no closer to being replaced

3. LLMs and AI coding tools will actually make code worse and less efficient, cancelling out any presumed efficiency gains from their use.

The overall sentiment I seem to be picking up from some senior devs regarding recent changes in the job market is: "Haha, I feel bad for all you junior devs who can't get a job now, but AI will never replace ME".

Anyone who agrees with any of the 3 sentiments, what is the justification backing up your belief? Do you feel the general cynicism/dismissiveness is justifiable?


  👤 micahdeath Accepted Answer ✓
It'll probably be more like the chatbots or automated call centers that just annoy people.

If it's on the web, it can do it, but if it's new or complex, it doesn't seem to work. Mostly, I see it as an interface for reporting and maybe calling some APIs.

The data-conversion tests I've tried all failed horribly.


👤 robswc
>Anyone who agrees with any of the 3 sentiments, what is the justification backing up your belief? Do you feel the general cynicism/dismissiveness is justifiable?

>While most junior-level dev work is easily automatable, mid- to senior-level dev work cannot be replaced, and in 5 years' time will be no closer to being replaced

Because companies were not set up to contextualize and feed all their internal information into an AI. Maybe going forward this will change... _maybe_ in 10-20 years (rather arbitrary, I guess) one or two "managers" could use AI to interface with customers, but for the foreseeable future, I just don't see how it could work.

Assume you had an LLM that never hallucinated or gave a "wrong" answer. Someone still has to hook it up and constantly feed it context, or it's rather useless.
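
As a rough sketch of the plumbing that implies (assuming an OpenAI-style chat client; `fetch_internal_docs` is a hypothetical stand-in for all the retrieval infrastructure a company would have to build):

```python
# Minimal sketch: even a "perfect" LLM is useless without someone wiring up
# the context. Assumes an OpenAI-style client; fetch_internal_docs() is
# hypothetical -- it stands in for the retrieval layer a company would build.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def fetch_internal_docs(question: str) -> str:
    """Hypothetical: search wikis, tickets, runbooks, etc. for relevant text."""
    raise NotImplementedError("this is the part someone still has to build")


def answer_with_context(question: str) -> str:
    # The model only knows what we hand it here.
    context = fetch_internal_docs(question)
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": f"Answer using this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```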


👤 themerone
We could be headed towards a future where 80% of software is written by AI, but the other 20% requires 200 million humans.

AI will likely revolutionize some software development tasks, but it is too hard to predict the long term effects when we don't even know the future demand for software.


👤 superchroma
My job is messy. AI isn't about to replace that or figure it out. Humans are naturally inclined to create complexity: product owners, designers, etc.; the apparatus of business. It all comes down to us.

AI can take over the business of shunting flat, boring data around in APIs, but everything else isn't quite such low-hanging fruit. It can also have a crack at complex data analysis in really constrained problems, say, feature detection. I can't see dealing with the infinitely variable whims of UI designers, or approaching new problems, being within its purview, however.


👤 elfbargpt
I echo a lot of what the other people in here are saying.

I'm also pretty convinced that AI won't replace software dev any earlier than it will replace the majority of corporate jobs, and at that point everyone has a problem.


👤 JoeMayoBot
My impression is that state-of-the-art AI builds models from existing content and just re-packages the result in a way that looks intelligent. Please inform me if you are aware of instances where this isn't accurate. The code you're seeing generated may have already been written on GitHub or come from a Stack Overflow answer. The explanations you get might have come from Wikipedia or some blog post. If people stopped creating content in a specific domain, the AI wouldn't be able to answer any new questions in that domain.

If you want to verify this, visit the Bing search engine and scroll up to use the ChatGPT search. You'll see that it gives an answer and provides links to sources. If you read those sources, you'll see exactly what I mean, because the ChatGPT answer looks nearly identical.

There's also a valid point that you shouldn't copy/paste code from a forum or other source without fully understanding that code and ensuring its correctness. Blindly relying on AI potentially violates that practice. Also, who do you credit as the source of the code? The AI got it from somewhere else and might not tell you its true origin.

I have also heard rumblings about copyright violation and plagiarism. Clearly there's a loss of income for content creators who rely on advertising, because whoever reads the ChatGPT answer is unlikely to follow the links back to the original author. So, in addition to the wisdom of senior devs who have tried the AI and remain skeptical, there's potential for its use to be diminished via legal means.

We don't know what the future holds and should question everything.


👤 ChatGTP
I think even if it replaces devs, the devs will make up some other bullshit job to do. Most software development is bullshit, so it's already this way.