Would you consider starting to learn a programming language now that ChatGPT is here? And what path would you recommend to young people who want to become programmers? Machine learning? Prompt engineering?
I would be grateful if you could share your thoughts and opinions on this matter. Thanks.
I feel it's the same with programming languages. This advice from Peter Norvig [1] was very helpful for me personally:
Learn at least a half dozen programming languages. Include one language that emphasizes class abstractions (like Java or C++), one that emphasizes functional abstraction (like Lisp or ML or Haskell), one that supports syntactic abstraction (like Lisp), one that supports declarative specifications (like Prolog or C++ templates), and one that emphasizes parallelism (like Clojure or Go).
[1] https://norvig.com/21-days.html#:~:text=Learn%20at%20least,C...).
You can see that this advice has proven to be 100% accurate.
And the practical reason why I'd recommend learning these now: you'll never have time to play with them when you're not young anymore. The whole domain of knowledge they represent will forever remain a missed opportunity.
In my book Geometry for Programmers (https://www.manning.com/books/geometry-for-programmers), I also advocate investing in mathematical education and a computer algebra system. Any system. I propose SymPy, but only because it's free and ridiculously simple to get started with.
The reason for this is also simple. Mathematical knowledge is non-perishable. ChatGPT can write boilerplate for you, and in any language, too. But to solve a real-world problem with math, you need a computer algebra system to solve your equations, and your own head to compose those equations. That's something beyond the reach of LLMs.
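For example, here's a toy sketch of that division of labor (the equations are made up for illustration, not taken from the book): you write down the system, SymPy does the grinding.

    # Toy illustration (made-up equations): I compose the system,
    # the computer algebra system solves it for me.
    from sympy import Eq, solve, symbols

    x, y = symbols("x y")

    system = [
        Eq(x + y, 10),  # the modelling step: these come from your own head
        Eq(x - y, 4),
    ]

    print(solve(system, (x, y)))  # {x: 7, y: 3}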
Become a software engineer and use the best tools available to you to learn and do your job. This means LLMs and everything that was available before them, like search, Stack Overflow, documentation, forums, books, courses, etc.
You don't need to work on machine learning in order to take advantage of it.
"Prompt engineer" is a bunch of bs, stop trying to make it happen. The language models are a useful tool, they're not your job. It's as stupid as someone claiming to be an "IDE engineer" or a "stackoverflow engineer".
If you're interested and programming is something you like, yes go learn a programming language and also learn software engineering topics.
Has any AI/ChatGPT-generated code passed quality control or been patched to prevent exploits? If not, then code with quality and security built into your work.
Yes I am learning Python.
For young people, learning to code and studying how computers work are the best skills to learn.
Similarly, if you're doing something like generating medical/therapy notes or SAT questions, then GPT is great; but for transactions or cases where exact behavior is a requirement, you need a discrete set of instructions that only a programming language can provide. You're not going to see payment systems built by GPT, in the short term at least.
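To make "exact behavior" concrete, here's a toy sketch (hypothetical fee logic, not any real payment system): every step is spelled out and deterministic, which is exactly what you can't leave up to a model.

    # Hypothetical fee calculation, just to illustrate "exact behavior":
    # exact decimal arithmetic, an explicit rounding rule, same result every time.
    from decimal import Decimal, ROUND_HALF_UP

    def apply_fee(amount: Decimal, fee_rate: Decimal) -> Decimal:
        fee = (amount * fee_rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
        return amount + fee

    print(apply_fee(Decimal("19.99"), Decimal("0.029")))  # 20.57, deterministically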
Maybe I'd recommend stuff like healthcare since that's going to still need individuals for a long time.
You learn to be a software engineer. The language is just one of the obstacles to overcome on your way to delivering value or a product.
Nobody, not even most people (let alone AI), will do this job for you. And this is what you get paid for, not for typing compilable letters.
I think ChatGPT could be helpful with things like: "please convert this function written in language X to language Y in an idiomatic way"
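For instance, something like this (both snippets are made up, just to show the shape of the request and a plausible answer):

    # Original function (language X, here JavaScript), pasted into the prompt:
    #
    #   function sumOfSquares(xs) {
    #     let total = 0;
    #     for (let i = 0; i < xs.length; i++) {
    #       total += xs[i] * xs[i];
    #     }
    #     return total;
    #   }

    # The kind of idiomatic translation (language Y, here Python) you'd hope to get back:
    def sum_of_squares(xs):
        return sum(x * x for x in xs)

    print(sum_of_squares([1, 2, 3]))  # 14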
ChatGPT is really good at explaining code.