P.S.: using a throwaway account; my main one has my email and website :/
One of the guys in the documentary says "we were afraid they wouldn't need special effects people anymore. The director would just go on the computer and hit D for dinosaur and R for rain, and we'd be out of a job".
But it turned out that even with computers, they had to do a whole bunch of practical effects to get the rest of the footage. So when a computer-generated dinosaur breaks through a fence, they had to make the fence break in just the right way so they could film it, and that took practical effects.
A lot of things have changed in the film industry and a lot of the old effects have gone away due to computers, but I don't think these big doom and gloom predictions people make are likely to come to pass.
I was listening to my favorite Martin Luther King Jr. speech, The Three Evils of Society, and he talks about the concerns people had about job loss due to automation back in the 1960s. And computers did come and automate away all kinds of stuff, but people still found work in other areas.
My point is people have been worried about automation for a very long time. And it is interesting to think about, but I think a lot of the hyperbolic claims that it's going to destroy all these jobs overnight are a bit overblown.
I mean, sure, like, I can imagine it could write some small applications, but only simple things that have been done over and over and that it can rehash based on its (admittedly massive) input data. Like maybe it could write a basic blogging application. But I doubt it could write a flight management application, much less adequately verify that it functions correctly and keep it up to date with the latest FAA regulations and navigation databases.
(On that note, since I do in fact work in aerospace ... considering how hard it is to certify any automated tools to use in avionics software development, I don't even want to think about how hard it would be to certify ChatGPT to churn out trustworthy avionics code!)
It is kind of like linear regression on mega steroids, but with non-linear functions. Lots of really interesting ideas/applications, but it is not an AI.
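To make that framing concrete, here is a minimal sketch of "regression with non-linear functions" using a tiny neural network; the data, model size, and library choice are illustrative assumptions and say nothing about GPT's actual architecture or scale:

    # Illustrative sketch only: fitting a noisy non-linear curve with a small
    # neural network. Nothing here is GPT-specific.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = np.linspace(-3, 3, 200).reshape(-1, 1)                # inputs
    y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)   # noisy non-linear target

    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
    model.fit(X, y)                                           # curve fitting by gradient descent
    print("Training R^2:", round(model.score(X, y), 3))       # how well the curve was fit

The comparison is about scale: the same basic idea of fitting a flexible function to data, just with enormously more parameters and training data.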
Code generation is moving extremely fast. This tech didn’t freeze in time at Codex or Copilot or ChatGPT. It’s one of the most exciting and difficult domains in AI and the smartest people are all set on solving it.
I’m sorry you’re feeling distress. You’re in good company. A lot of the world is going to have to deal with these problems very soon.
In the early 00s, there was tremendous outsourcing going on. Some people said Western programmer salaries would collapse.
So your anxiety is nothing new. I think of ChatGPT-like tech as being like a grammar checker. It will make your job easier.
Also, it is interesting to consider how chip design works these days. My understanding (as a software person) is that there are more people building test/verification harnesses and doing QA of sorts than designing the basic blocks. This model may be how things go in software.
It’s always been the same fear with every tech revolution: people thinking there won’t be any work left to do, afraid they won’t find a way to sustain themselves. When work is replaced, there is always new work to do, and the shift usually just shows up as an upscaling of global GDP. It’s a good thing. We are facing massive global challenges and could use greater efficiency and expression of our collective intelligence to address them.
As long as you stay willing to learn, you will always discover ways to contribute to society and be relevant. But if you’re attached to “I learned X and I hope it gives me bread for the rest of my life,” you probably chose the wrong century :-)
It really did help keep me in the flow. While I was designing the entire system and documenting it, I would just give ChatGPT the requirements in a few sentences and it spit out correct code.
Honestly it did take the place of what I would have formerly thrown at an intern/junior dev.
Will it “replace” the need for junior devs? I am going to say the unpopular opinion - yes.
But coding has always been the least important part of my job. Knowing what to code and solving XY problems is where the money and the differentiation are.
GPT is going to put millions of writers, artists, and programming/IT professionals out of work, full-stop. It will also make Sam Altman and other tech billionaires even richer than they already are, and worsen already extreme wealth inequality. It's obvious enough at this point that I can't believe other people don't see it. Just look at how rapidly the technology is being adopted, even by normies.
I'm not going to give the OP life or career advice, but for my part, I am actively trying to invest substantial amounts of my money into the firms I think will win the AI revolution, at least to protect myself from the downside.
For simple programs where the requirements are trivial, sure. But for any software with a bit of complexity, that is integrated with other software of similar complexity, I don’t see how this becomes a job destroyer. In most software roles I’ve worked, big and small, the actual development is largely about factoring and balancing a ton of requirements (security, performance, business needs/timelines, customer asks, hardware constraints, operations, on and on). You’d probably have to write a hell of a ChatGPT prompt to explain the entirety of what you need and to replace the human work of evaluating all of that :-)
Edit: I like the comment below about the movie industry. The evolution of technology has pushed the boundary of what's possible, but we still have a lot of the same needs in terms of people.
The great news is that many of these products are free or relatively inexpensive, so you can start gaining experience with them today.
The best way to future-proof your career in the event that language models become widely applied is to make it one of your core competencies. :)
If you start practicing today, you'll quickly be at the front of the curve. And try to practice deliberately, for example gaining experience with:
- how to give it all the necessary information and nothing superfluous,
- how to effectively articulate the desired deliverable and its requirements,
- how to guide it through iterating on the solution and when to stop (often I do the first 10% of the work, then the model does the next 60% of the heavy lifting, and then I finish off the last 30% to fix its mistakes and tailor the result). A rough sketch of such a prompt follows this list.
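Here is a rough, hypothetical way to structure such a prompt, covering the first two points; the section names and the sample project details are made up for illustration, not a prescribed format:

    # Hypothetical prompt template illustrating the habits above: relevant
    # context only, a clearly stated deliverable, and explicit requirements.
    def build_prompt(context: str, deliverable: str, requirements: list[str]) -> str:
        """Assemble a prompt with only the necessary context and a clear ask."""
        reqs = "\n".join(f"- {r}" for r in requirements)
        return (
            "Context (only what is relevant):\n"
            f"{context}\n\n"
            "Deliverable:\n"
            f"{deliverable}\n\n"
            "Requirements:\n"
            f"{reqs}\n\n"
            "Improve the solution step by step and stop when the requirements are met.\n"
        )

    print(build_prompt(
        context="A Flask app with a /users endpoint backed by SQLAlchemy.",
        deliverable="A paginated GET /users handler that returns JSON.",
        requirements=["Page size capped at 100", "Return 400 on invalid query params"],
    ))

For the third point, iterate by feeding errors or gaps back in as new requirements, and stop when further rounds stop improving the result.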
Toys are toys. Tools are tools. And ChatGPT is a toy quickly becoming a tool that also won't replace people.
However, it doesn’t understand anything of its underlying subject matter.
It has simply (!) accumulated a wealth of information from which it chooses the best answer. Of course, that is very impressive, but a thinking machine it is not.
My anxiety is not about the AI itself but about the ability to game humankind and the normalization of content. You think everything is quite similar already? Let’s see how anyone can determine what is new, real, authentic human content versus bot rehashing.
I wouldn't be too worried about another quantum leap for at least another few years, and I highly doubt that will be enough for AI to generate code.
In the long term it's definitely something to keep in mind; be aware that you should be moving up the stack in terms of critical thinking so you aren't left as a code monkey 20 years from now, when AI will definitely be eating your lunch.
In the meantime if you think another skill or business might come in useful, do it. But do it calmly if possible. Anxiety will never help you do something better or more. Ever. If calm is not possible for you, then that is a skill I suggest you start cultivating.
I don't think it'll take your jobs away; I think it'll just open up new types of jobs for everyone.
The increase in the productivity of people developing software will lead to far more software being generated, not fewer software developers.
As an additional moat, any non-trivial business has a lot of context and tacit knowledge that will not be within the corpus of "knowledge" that AI models have, and there will be many non-technical hurdles (regulatory, political, etc.) before any of that business context can be fed into a model, if ever. Remember, these models only have access to public information at this point in time.
I think it's all about being adaptable; being here on HN and thinking about these things will put you ahead of others who simply don't see the change coming. Unfortunately we can't put the AI genie back in the bottle, and we just have to be ready to adapt.
People used to program by flipping switches or punching holes in punch cards. We made assembly languages, higher level languages, optimizing compilers, libraries of commonly used code, and so on.
There is presumably some tipping point where we become so productive that fewer people are needed. It hasn't happened yet, and there's no good way to know if and when it will happen.
I've 100% accepted that AI will displace tons of jobs and lead to the near dissolution of meaning and discourse "online" (as if anything else will exist). And I'm just trying to be happy day to day until that and its secondary effects come to pass, or until one of the other countless seemingly incoming disasters happens.
Someone has to ride herd on this AI stuff and make sure it stays sane. It'll take a while for everyone to figure that out, but we'll get there.
I'm getting too old to fight in the robot uprising when it comes.
If you could look into the future and know you’re right about this, what would you do today?
====
This is a story on Hacker news:
STORY:
Ask HN: How to Deal with ChatGPT Anxiety?
Since the release of ChatGPT I can't help it but be pessimistic about what the future holds for the field of software engineering. As a Junior developer I'm so thankful for the constant learning I'm experiencing at work and despite the low impact I have on issues/code base I'm always making sure that I take it one day at a time. I keep hearing friends and people online saying that the calculator didn't replace mathematicians and that Excel didn't replace accountants, but I can't help it to be anxious and think of ChatGPT replacing me and is making my imposter syndrome worse. I wanted to ask HN and see what they think of my fears, am I unreasonable? I know that I should not fear such tool but make sure I learn it and have it help me become more productive. Any tips or recommendations on how to deal with such anxiety/fear. Thank you in advance.
COMMENT:
You're in on the ground floor at the right time. Take advantage of ChatGPT/Codex/GPT-3 to make your job easier and to make your productivity sky high. Being a junior you have less to unlearn.
Use it to understand unfamiliar code.
Use it to understand long historical arguments with the right amount of detail relevant to your needs (Vi vs Emacs / Tabs vs Spaces / JQuery vs VanillaJS / etc.)
In particular use it to speed up your learning (paste documentation into it and ask for summaries or questions about how something works).
Use it to brainstorm some approaches.
Use it to write a step by step set of instructions for tasks you need to do.
Use it to clean up your grammar / points of view in writing.
Use it to evaluate or at least present the pros/cons of several approaches or decisions.
Use it to fill the missing parts of documents.
Use it to improve your resume.
INSTRUCTION: Rewrite the comment and add 30 example use cases for ChatGPT and GPT-3.
REWRITTEN COMMENT:
You are in the perfect position to benefit from ChatGPT, Codex and GPT-3. Being a Junior, you don't have to un-learn any old habits and can take advantage of this amazing technology to boost your productivity. Here are 30 examples of how you can use ChatGPT and GPT-3:
1. Understand unfamiliar code.
2. Quickly produce summaries of long historical arguments.
3. Generate questions to help you understand how something works.
4. Brainstorm approaches to problems.
5. Create step-by-step instructions for tasks.
6. Improve your grammar and points of view in writing.
7. Evaluate or present pros/cons of several approaches.
8. Fill the missing parts of documents.
9. Improve your resume.
10. Automate tedious tasks.
11. Generate original content.
12. Summarize articles and reports.
13. Automatically generate diagrams.
14. Identify patterns in data.
15. Automatically generate code.
16. Translate between languages.
17. Generate realistic images.
18. Automatically generate titles and headlines.
19. Automatically generate questions and answers.
20. Automatically generate test cases.
21. Automatically generate code reviews.
22. Automatically generate design specs.
23. Generate natural-sounding dialogue.
24. Generate automated customer service responses.
25. Generate product descriptions.
26. Automatically generate documentation.
27. Automatically generate bug reports.
28. Automatically generate marketing materials.
29. Automatically generate unit tests.
30. Automatically generate software architecture diagrams.