It's like any tool: it multiplies the effort of a designer. Previously we had shovels, so we needed a lot of healthy guys to shovel dirt. Now we have earth movers, and a more trained person operates one. But it also means a lot more projects become viable, so more people will end up getting sucked into them.
Personally I'm looking forward to being able to specify software without having to deal with minor issues like forgetting what to import, off-by-ones, and that kind of thing. I can spend more time thinking about the requirements.
We are a long way from AIs that can operate independently. And until then the AI will just be a productivity tool. Maybe we will all become a hundred times more productive, and it will still not be the end of the programmer.
The real threat that I see is that the AI will leave the boring part to us: the code review of AI work. If I see that development happening, I will plan for an early retirement.
Another important question is about scale: making a 10-line application a thousand times is not the same as making one 10,000-line application. How good will the AI be at that?
Right now it is the same as "googling the answer", just faster. Which is impressive, but far from career ending.
I think another important aspect is that a lot of SW development is building upon already existing code bases. One benchmark for an AI would definitely be its ability to take in an entirely novel code base and make edits to it. I doubt that any current language model, no matter how well trained, can achieve that. Its tendency to invent things on its own seems to make that destined for failure.
>Also, What fields do you think will be safe from the AIs
Any industry which is more than a bit hesitant about change. E.g. aerospace and defense.
Mindless stuff like "add a column with birthdate in it" could be automated though.
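That kind of task is already mechanical enough to script. As a sketch of how small the job actually is (using an in-memory SQLite database and a hypothetical `users` table, names chosen for illustration), the whole "add a birthdate column" change is a single `ALTER TABLE`:

```python
import sqlite3

# In-memory example database with a hypothetical "users" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# The one-line migration an AI (or a human on autopilot) would generate:
conn.execute("ALTER TABLE users ADD COLUMN birthdate TEXT")

# Confirm the schema now includes the new column.
cols = [row[1] for row in conn.execute("PRAGMA table_info(users)")]
print(cols)  # ['id', 'name', 'birthdate']
```

The hard part was never the syntax; it's knowing whether the column belongs there at all.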
I say this because look at some of the garbage software produced by real humans/contractors who actually have reasoning capabilities, and it's still buggy and terrible.
There will be specific exceptions with specific medical tasks, though I can’t really forecast which or how many tasks. Being able to triage arrivals at A&E based on their symptoms is different from having a database of known pharmaceutical interactions, and both are very different from being able to tell which mole is cancerous and which is benign.
But overall, I expect the domains with limited or absent training sets to be those that go longest before being automated, and I think medicine has a lot of specific tasks in that category.
I do see lots of positives and use cases for more narrowly defined tools that help the programmer and make him/her more productive and powerful.
For example, I've been playing with a new terminal app (Warp) that lets you type plain English at the prompt. It then translates it to the proper bash command using GPT-3. It's brilliant, it works, and it doesn't put me out of a job. It just makes me more productive and more able to focus on the problems specific to my company.
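To make the shape of that workflow concrete, here's a toy sketch in Python. A hard-coded lookup table stands in for the GPT-3 call, and the function name and example prompts are all hypothetical, not Warp's actual API:

```python
# Toy sketch of a Warp-style prompt: plain English in, shell command out.
# A hard-coded table stands in for the language-model call.
CANNED = {
    "show files sorted by size": "ls -lS",
    "find all python files": "find . -name '*.py'",
}

def english_to_bash(prompt: str) -> str:
    # A real tool would query a model here; we just look the prompt up.
    return CANNED.get(prompt.lower(), "# no suggestion")

print(english_to_bash("Find all Python files"))  # find . -name '*.py'
```

The interesting part is the interface, not the lookup: the user never leaves the prompt, and the suggestion is still theirs to review before running.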
And without coding skills, that boilerplate will break or fail to get stitched together, too.
I absolutely see it making some parts of the development process vastly more efficient though.
As you get more specialized, there is less example code, thus the AIs do worse. As you move to the cutting edge, there is almost no example code.
So things that have been done thousands of times will get eaten by the AI coders first. Then it will move up the value chain; how fast isn't clear.
Now earth-moving is one industry. Let's talk for a minute about another industry. Ice cutting. The ice trade was a huge industry in the 19th century. At its peak it employed 90,000 people in the United States alone. It was ended almost overnight by the invention of the refrigerator. Hardly anybody even remembers that it was a thing. The world went on without a thought. That is another possible outcome of automation. And as scary as it is, I'd still rather have my refrigerator.
So maybe the programming industry survives and just employs fewer people, or maybe it goes the way of the ice trade, and in a century or two it will just be a footnote of history that people used to have to manually program computers.
Could AI reduce the need for programmers? Probably. I could see some "low code / no code" services built around AI. They could make their own programming language and/or training data in a specific domain to be more easily digestible by an LLM.
I think the situation for programming is pretty similar to the AI art situation, though arguably programming is the more difficult problem.
If AI assisted code is like this then no because it would waste more time than it saves.
My biggest concern is that the competitive nature of engineering and pressure from the business side of software will push the industry to adopt it, and instead of 3 zero-day vulnerabilities there will be 500. Failures being opportunities, nation states will see AI-assisted code and try to figure out how to use it to their benefit, while you and I get a 15-second shout-out during the next all-hands meeting for spinning up the app the VP of fantasyland dreamed up.
Would you feel safe if your airbag system were written by some statistical model? Or worse, if all layers in the stack for the airbag system were written by some statistical model?
Having an almost correct implementation written by some "AI" at the highest level sounds reasonable at first thought, but now imagine having almost correct implementations at all levels. On average it should work, but it won't. Software isn't about getting things right on average. It is about getting things exactly right.
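A back-of-the-envelope calculation shows why "almost correct at every level" compounds badly: if each layer is independently right with probability p, the whole stack is right end to end with probability p^n, which decays fast as layers accumulate.

```python
# Probability that a stack of n independently "99% correct" layers
# is correct end to end: p ** n.
p = 0.99
for n in (1, 5, 10, 50):
    print(f"{n:>2} layers: {p ** n:.3f}")
# At 50 layers, a 99%-correct component gives you a ~60% correct system.
```

This is of course a simplification (real failures aren't independent), but it illustrates why "right on average" at each level doesn't mean "right overall."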
1) Peter Thiel’s 10x improvement concept from Zero-to-One.
2) The exponentially increasing rate of AI research and improvement.
Additionally, a third point is relevant:
3) Max Tegmark’s idea that the best AI agents will have been generated by AI.
I think we’re seeing an ‘intelligence’ explosion begin to rustle from its sleep. It’s difficult to predict when it’s coming, but I don’t think SEs are the ones with the safest jobs. The inflection point will be a ChatGPT that costs $10k/yr to run, as that’s roughly a 10x improvement over the cost of devs.
No job will be safe, and that's a good thing. People value themselves too highly, and I bet from experience all of us know at least one guy who could have been easily replaced with a shell script.
That is not meant to say that person is not able to do other things or is impaired in any way, but imagine how many jobs in general could be replaced by very small shell scripts. Coding ain't gonna be any different.
A "computer" used to literally be a person operating a machine or doing arithmetic in a bank.
"Programmer" will start to mean more like someone who knows how to interface with and connect up different AIs.
Many jobs will be mostly replaced with AI. There will be a lot of new jobs. Especially for people who integrate with AI via brain computer interfaces.
I'm more worried about everything becoming even more of a buggy mess than it already is. Wonder how good something like this can get at predicting weird edge cases.
If you can type a comment and the (current-gen) AI assistant spits out a working implementation, it's almost a certainty that you could type it as a Google question and copy the Stack Overflow answer. It doesn't invent anything novel. At least not yet.
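That's the level being described: snippets that exist nearly verbatim in thousands of Stack Overflow answers. A typical example of the genre (the function name is just illustrative):

```python
from datetime import datetime

def parse_iso_date(s: str) -> datetime:
    # Classic "googleable" one-liner: parse "YYYY-MM-DD" into a datetime.
    return datetime.strptime(s, "%Y-%m-%d")

print(parse_iso_date("2023-01-15").year)  # 2023
```

Whether typed by hand, pasted from a search result, or completed by a model, the snippet is the same; the assistant just shortens the round trip.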
Software is highly profitable because of its very low capital intensity. A proprietary productivity boosting AI would increase the capital intensity of software development which is bad for workers aka software engineers.
I.e. 10 LOC -> 100 LOC -> 1000 LOC -> 10K LOC -> 100K LOC.
I fail to see AI moving beyond 1000 LOC. I.e. 10 × 100 LOC << 1000 LOC.
Also, the AI is statistical; it does not have any notion of the real world. I.e. it might know how to write a function, but it does not know why you need the function.
A new arm of Government will develop to regulate A.I. --- a need Elon M has been crowing about for some time. There's talk of writing spec for Wario bots along a gradient of danger levels.
The job of most programmers is to understand a problem and to create a few kB of text per day. On this front GPT is already a million times more productive. Maybe it doesn’t understand all the subtleties, but I don’t see a reason why it can’t.
But if you look at most offices, the job of people is to shuffle a few kB of text per day: email to customers or suppliers, a bit of documentation, etc.
Add a humanoid robot for the manual tasks and one or two years of improvement of the AI.
Or it could transform swdev into an AI-centric industry where people take care of everything the AI cannot.