People need programming tools to generate correct code, so I'd think there needs to be something that couples the creative ability of neural networks to logical tools such as theorem provers.
AlphaGo uses a strategy like this in that it is based on Monte Carlo tree search, a basically "old AI" method that simulates a vast number of games played out to the end. You could play Go passably well using purely random playouts, but AlphaGo plays much better because it generates playouts with a neural network that picks better-than-average moves from a flat analysis of the board.
The search + network combination plays better than either the search or the network alone, because the search supplies the deep lookahead that the network can't do by itself.
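The playout idea can be illustrated with a toy sketch (this is my own illustrative example, not AlphaGo's actual algorithm: the "game", the move weights, and all the function names are made up). A position's value is estimated by averaging many playouts to the end; swapping a flat random move picker for a "policy" that favors better-than-average moves improves the estimates the search works from:

```python
import random

# Toy "game": over TURNS turns, pick a move from MOVES each turn;
# the final score is the sum of the moves picked. Higher is better.
TURNS = 5
MOVES = [0, 1, 2]

def playout(pick_move):
    """Play one game to the end using the given move-picking function."""
    return sum(pick_move() for _ in range(TURNS))

def random_move():
    # Flat random playout: every legal move equally likely.
    return random.choice(MOVES)

def policy_move():
    # Stand-in for a policy network: biased toward stronger moves.
    return random.choices(MOVES, weights=[1, 2, 4])[0]

def estimate_value(pick_move, n=10_000):
    """Monte Carlo value estimate: average final score over n playouts."""
    return sum(playout(pick_move) for _ in range(n)) / n

if __name__ == "__main__":
    random.seed(0)
    print(estimate_value(random_move))  # near the uniform average
    print(estimate_value(policy_move))  # noticeably higher
```

The point of the sketch is only the structure: the search machinery (playouts, averaging) is classic Monte Carlo, and the learned component slots in as a better move-sampling distribution.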
I also see a lot of papers where people are doing modelling (say, of proteins) or optimization (energy transport and storage grids, creating new materials including proteins) with a similar combination of a neural net and other software architectures, and I'd expect to see the same in software engineering too.
Really?
How would you incorporate GPT into a compiler? Or a code formatter? Or a linter? Or a debugger?
I am not saying that it is impossible to use GPT features in programming tools, but claiming that "programming tools will need to make use of this tech" seems like a stretch.
Code generation seems like an obvious use case, but that does not mean the generated code will be correct for the use case, and it may still need to be debugged.
When Bitcoin and blockchain were all the rage a few years ago, lots of companies rushed to create blockchain-based tools for everything. Most of those tools never really took off because blockchain was an ill fit for those applications and did not offer substantial advantages over the existing technologies.
Sure, GPT is interesting and exciting, but that does not mean practical use cases for it exist everywhere and in all domains.