I have a hard enough time using and trusting snippet engines, or doing due diligence when implementing a solution I find on Stack Overflow.
Roughly 30% of developers in the 2023 Stack Overflow survey responded the same way, which makes us a minority, and one I'd guess will shrink every year: https://survey.stackoverflow.co/2023/#section-sentiment-and-usage-ai-tools-in-the-development-process
Today, I don't think the tools are good enough to make a material difference. It may help a bad engineer tread water, but it won't take you from good to great. It may save you time writing basic boilerplate and individual functions, but I suspect 99% of engineers don't struggle with that. What's hard about our jobs is knowing how to orchestrate the whole thing and put structure around complexity. AI can't do that yet.
When I use it personally, it feels like a harder context switch: describing in English what I already know how to code. Then I still have to review the function to make sure it's accurate. It ends up feeling like a waste of time.
Whenever the AI gets better, I have no doubt we'll have to use it to be productive. But the pool of engineers will change too: there will be a category of engineers who can't debug the AI's output and just keep writing crazier prompts.
Maybe I'm old, but I'll only be worried about AI when it can write and maintain a full app with no human intervention.
ChatGPT saves time writing short functions, maybe 10-20 lines. That's maybe 10 minutes a day.
ChatGPT helps me start or debug things outside my domain, where I'd normally need to read some guides before even getting going. This doesn't happen every week, but when it does, it's hours saved.
Copilot saves a bit of time with simple 1-3 line suggestions, where you already know how to write the code or can immediately verify its correctness. That's probably another 10 minutes per day.
So, on average, I get maybe 30 minutes or so per day saved with close to zero downside. It's not going to make or break a career but it's a good chunk of time.
I wouldn't worry about it. AI tools are trendy right now and are being hyped far beyond their actual usefulness. I've seen this before with other things. Eventually, things will calm down and AI tools will become stable and just another standard tool that will provide incremental improvements.
That will be the time to try them out and see if they are useful.
Typing code is not where the value of a software developer comes from: it's the thinking, the planning, the accumulated experience, and the communication with other people where the most value emerges. You will not hurt your programming career.
Way back when IDEs were the hot tool, some developers kept using the command line and vim. That choice did not hurt their careers: I've worked with such developers and they do just fine. IDEs make a lot of things easier, but they don't make the difference between success and failure.
I complained to a friend recently that we're drowning in spam-tier programming tutorials, giving the example of basic web-app database design. There are hundreds of tutorials that all cover roughly the same bare minimum basics, and only the basics, with some of them less safe than others.
My friend put some of the queries through ChatGPT and what came back was... spam-tier crap that said to do one thing (separate user table and auth table) and in its example code did the other (put `password_hash` into the users table).
This is an objective downgrade from the ocean of spam-tier crap. Previously there were 999 tutorials that only covered the bare basics, with not enough deep-dive or "why it's this way and not that way" material. Now there's that same status quo, plus material that is transparently wrong.
This is not, I think, an isolated incident, but an illustrative one. So feel free to wait another year. If it's actually truly useful by then, I'm sure you'll notice and pick it up without trouble.
My main use case is asking it for other ways to approach problems I already understand. Sometimes it surfaces interesting ideas others have already had, which I like. The other day I was using promises in the JS world to create a sort of sequence of who-knows-when-it'll-finish tasks, and GPT pointed out that what I actually wanted was an async generator. Kind of obvious, but also not at all: people rarely reach for generators in the JavaScript world. So that was cool.
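Here's a rough sketch of the pattern it pointed me toward (my own reconstruction, not GPT's actual output): feed the who-knows-when tasks into an async generator and consume their results in completion order.

```javascript
// Yields each task's result as soon as it settles, whatever the order.
async function* inCompletionOrder(promises) {
  // Tag each promise with its index so we know which one raced ahead.
  const pending = new Map(
    promises.map((p, i) => [i, p.then((value) => ({ i, value }))])
  );
  while (pending.size > 0) {
    const { i, value } = await Promise.race(pending.values());
    pending.delete(i);
    yield value; // a rejection anywhere propagates out of the loop
  }
}

const delay = (ms, v) => new Promise((res) => setTimeout(() => res(v), ms));

(async () => {
  for await (const result of inCompletionOrder([delay(300, "slow"), delay(100, "fast")])) {
    console.log(result); // "fast", then "slow"
  }
})();
```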
Despite a fair amount of experimentation and building basic stuff with the API, that has been my most valuable use case yet, the one with real rewards. I also made a tool that surfaces solutions to error stacks I pipe into it (in VS Code, or standalone where it listens to your app's errors). It was interesting and kind of worked, but was mostly just a time sink. It could be fine-tuned, and I'm sure Copilot X will solve this elegantly, but there's no real magic underneath it. You don't need to know it; you'd figure it out fairly quickly if you needed to.
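For the curious, the shape of that error-stack tool is roughly this. A minimal Node 18+ sketch of the idea, not my actual code; the prompt and model choice are illustrative:

```javascript
// Sketch: catch an uncaught error and ask the OpenAI chat completions
// endpoint to explain the stack trace. Assumes Node 18+ (global fetch)
// and an OPENAI_API_KEY environment variable.
process.on("uncaughtException", async (err) => {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4",
      messages: [
        { role: "system", content: "Explain this stack trace and suggest a fix." },
        { role: "user", content: err.stack },
      ],
    }),
  });
  const data = await response.json();
  console.error(data.choices[0].message.content);
  process.exit(1); // still crash, but with a suggestion attached
});
```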
The secret sauce inside AI, on the other hand, isn't something you learn easily. That's a career in and of itself, and it's not trivial to pursue. For the time being, I think it's totally fine not to adopt these tools as a developer. Just keep an open mind, explore when you feel interested, and keep focusing on doing what humans do well but machines don't.
Just try it and see what the results are. You may be surprised how good they are.
And if you weren't planning on using SO or Google, then you don't need to think about it.
If I had to articulate why, it would be that I could feel it putting me in a sort of "begging" mindset rather than a "thinking" mindset.
It also entirely depends on what sort of career you want. There are a million flavors of software engineering careers. Some flavors will be affected much more than others.
Where I work, our manager announced the new AI policy: use it if you want, but the company won't be paying for it. Three quarters of the engineers at my location have no plans to use it at all.
At first I found it quite a leap to trust ChatGPT to understand my questions and my system, but it really does give reasoned answers, even if some of them are wrong. It's worth knowing this and asking it questions about the code. In fact, it'd be great to get inline definitions of the functions it uses in the ChatGPT UI, to be able to at least quickly check whether the right parameters were being passed. Maybe this checking of lies is much harder for junior developers, who get the slightly broken output and struggle to refine it?
I highly recommend building a small project in a technology you don't completely understand, using ChatGPT to write the majority of the code while you start filling in your knowledge gaps. Expand the project alongside GPT and your own knowledge, so it goes from heavy lifter to support.
(In my case that was Python and PyQt, with only a passing knowledge of Python and none of PyQt or GUI apps.)
One issue I have: when I build a system myself, I know the system. An AI-built system is like a different programmer's work, with all the effort of understanding it that goes along with that.
It's a mighty useful tool as a bridge of knowledge and can give amazingly detailed information, but overusing it will lead you into a dark corner, and underusing it might leave you behind.
Is it possible you're not breaking the problem down the right way? I.e., get better at understanding what it can do for you, and at conveying what you want in a better way.
To me it seems like a huge handicap not to use these. At the same time, if you use them too much your skills may degrade over time (maybe the skills that degrade aren't worth holding onto in this new world anyway... we'll see).
It reliably acts like a junior dev who can sort of complete the statement, or add the next N branches after you write a few lines. But you have to be good at spotting bugs and inefficient implementations quickly, and probe it until it corrects them.
Coding tools like Copilot, on the other hand, are completely useless. I think they'd need to be integrated with the compiler and the IDE's indexes to be of real value. Perhaps you'd also need a network architecture other than transformers to really understand the tree-shaped nature of code.
I am not a programmer (yet). A long time ago, I wanted to get into photography. I researched equipment, effects, lighting, etc. Then I happened upon a blog run by a photographer.
His advice: are you creative enough to know the subject, what to photograph, and what kind of results you want? If yes, you can always start with an iPhone, and once you hit its limitations, augment it with equipment you build up over time. He cited some great examples of Flickr photography done on iPhones, and I was sold on the idea.
IMO, from a layman's perspective, we should (try to) perfect our trade first and choose tools later, whatever suits our style.
I'm some percentage more productive because of Copilot, and especially because of ChatGPT (with GPT-4). No idea what the number is, but it feels significant, since these tools often help me get unstuck, generate new ideas, explain code, propose solutions, do some trivial tasks, etc.
I think the people who aren't curious enough to experiment with it, or have some mental block against it because they feel threatened, or can't use the tools correctly are doing themselves a disservice in the long run.
I'm also noticing a significant difference in the performance of colleagues who use these tools a lot vs the ones who don't go anywhere near them.
My answer is: "No". You are not hurting your career, but it is still something that could be fun to use.
I've found it a really mixed blessing. Great for getting boilerplate out of the way quickly and for trying new frameworks; but it gets in the way a tonne, and the second-guessing and context switching eat at the gains.
What I use it for: 1. a better IntelliSense, and 2. getting up to speed on things I have less experience with.
Are you hurting yourself? Only you can answer that. I know I would be hurting myself if I didn't use all the tools that were useful.
Spend more time figuring out how to find problems in code.
Spend more time figuring out how to learn new things in the ways most effective for you.
That time will be repaid in your professional career.
Spend less time worrying about each year's obsessive fad. If it has positive value, it will be easier and safer to use in the future.
GPT-4 is an enormous upgrade over standard ChatGPT (3.5 Turbo) for logic and code. I believe Code Interpreter was created by taking the base GPT-4 model (not the instruct model) and adding a "code layer".
AI tools that I've played with only get more useful the more you already know.
You need the traditional sorta experience to know when it's generating bullshit or garbage code.
These tools are clearly very powerful and likely will get more powerful. What they do would have been considered science fiction 3 years ago. Getting into the habit of asking your "assistant" to perform tasks seems like an investment which will give solid returns.
That being said, I don't think you are hurting yourself much today. But the trend is not in your favor.
PS: these tools are tremendously useful when creating documents. YMMV.
You will need to be very good at programming to be competitive against people that use AI coding tools.