HACKER Q&A
📣 FinnLobsien

Does ChatGPT/AI enable anyone to code?


I'm non-technical and have never written code. There's a lot of talk on Twitter about how AI means anyone can now easily code and build products and stuff.

Since a lot (most?) of HN is about technical stuff, I'd love to find out from y'all - how true is that?

Do you think that software engineering will get to a place where anyone can just say "help me build an app that does these three things" and then deploy it?

Or will a baseline of skill always be required?


  👤 oxfordmale Accepted Answer ✓
Have you ever done DIY? Things like replacing a tap can be learned very quickly on YouTube. So why do people still call plumbers? Because reality is never as clean-cut as YouTube videos. Sometimes you find the connection parts have completely corroded and you can't undo them with your standard DIY tools; other times you discover you have odd-sized connections (metric versus non-metric). The real skill of a plumber is being able to handle real-life problems.

It is the same for coding. Yes, you can ask it to generate basic code, and it will work. But my job as a software engineer isn't to write code; it is to deal with messy humans. Sure, ChatGPT can write an app that does three things. Does it work reliably on all the different mobile devices? And do you only need three things, or do you really want 16? Most clients know precisely what they want, but they can't express it accurately in words in a single iteration. When you have delivered those three things, they will often ask for more. They will often ask for a hexagon-shaped room after you have built them a perfectly circular one, and then complain heavily if you quote that it will take several weeks to refactor.


👤 naffenuf
Getting software built with ChatGPT depends quite a bit on your level of curiosity and your ability to formulate requests clearly. Tell it what you're trying to do as clearly as possible, with images if you have them, and start asking questions: "I'm not a programmer, but can we work together to implement this? Where would we start?" Ask it to write the code in English (pseudocode) so you can read it. Break your application into the smallest possible pieces and build them one at a time so they work together. Once you're sure it's going to do what you want, ask what language to use and what tools to install. If it says you need a Python script, ask it to write it. If you don't know how to run that on your computer, ask. If the interface isn't what you intended, show and tell it how to fix it. Get it? This process is slower than if you knew how to code yourself, but it's also a hell of a lot of fun, and the LLM does a lot of the tedious stuff.
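To make the "pseudocode first, smallest pieces" idea concrete, here's a hypothetical example of what one such piece might look like (the expense-totals task and all names are made up for illustration, not from this thread). You'd first get the plain-English version, then ask for the real code:

```python
# Pseudocode the LLM might give you first:
#   for each expense (category, amount):
#       add the amount to that category's running total
#   return all the totals
#
# ...and then the actual Python it writes when you ask:

from collections import defaultdict

def totals_by_category(expenses):
    """Sum (category, amount) pairs into a per-category total."""
    totals = defaultdict(float)
    for category, amount in expenses:
        totals[category] += amount
    return dict(totals)

print(totals_by_category([("food", 12.5), ("rent", 800), ("food", 7.5)]))
# → {'food': 20.0, 'rent': 800.0}
```

Because each piece is this small, you can check it does what you meant before asking for the next one.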

👤 giuliomagnifico
A baseline of skill will always be required. You can ask ChatGPT to improve or correct your code, but if you have no idea where to start, you can't rely on it 100%.

Its hallucinations can sometimes be dangerous: you can't copy-paste everything, you have to look over the code to catch nonsense operations.

This can also be dangerous because if ChatGPT writes an entire app and that code has a bug, it will replicate the same bug in every other app that relies on the same code. And if someone finds a vulnerability in the way ChatGPT writes code, it can be exploited in every app created with it.

However, it's a super useful tool, especially when you have to write repetitive code or do something like renaming a variable; you can ask ChatGPT to handle those repetitive tasks and it's good at them.


👤 ogou
"easily code and build products and stuff"

Those are three different things. Most of these AI agents can produce working code in different languages. Whether that matches tightly defined business needs is a different matter. Writing code that makes a P5.js sketch that animates a cat is a neat trick, but not something you can build a product around. Building products is a broad enterprise and can be very difficult, even with a full team of highly skilled engineers. Google and Facebook have access to the best engineers on the market, and even they make products that bomb or go way over budget and deadline. The "and stuff" part is usually the part that makes money. Logistics, financial infrastructure, marketing, marketplace differentiation, and much more go into a successful business built on software. An AI agent can describe those things in plausible terms, but it can't manifest any of it.

What these platforms CAN do is help prototype experiments quickly. Then you could take it to a Dev team and have it built properly. I think that is an interesting future and will allow more people to participate in the early stages of some projects.


👤 hpen
I think ChatGPT is just the next-level search engine for coding. It doesn't fundamentally change anything about building complex systems; it's just another tool.

I would never ship a production app built with ChatGPT. I need to know the intricacies of my code base. I wanna fix bugs in my head based on the architecture and specific implementation details. I need to know how fragile my system is and what its weaknesses are.


👤 JSDevOps
Give it a go and let us know how you get on ;-) … At this point, I'd be more worried if you're a project manager and your sole job is shuffling tickets around a Jira board or a spreadsheet.

👤 muzani
It's not that easy. I mentor at hackathons. Early on, even with GPT-4, lots of people thought the key was thinking up good ideas and getting LLMs to build them. Most still didn't get their apps up by day 2. There's definitely a baseline of skill.

It certainly helps with deployment though, especially things like why import vs require, where to find the API keys, which class to look inside, and so on. I would say we're twice as productive now. If it took you 3 days to do hello world, it would take maybe 1 now.

You definitely still need to read the docs. AI is just great at telling you which docs to read. But it may lead you astray if you choose not to.

Another gotcha is that an LLM is the average of above-average skill. It sounds funky, but basically GPT still recommends a 2018 stack for Android, which is far outdated, even though it knows the 2024 stuff. We see people trying it in interviews, and many end up giving feedback that they wish they hadn't used it.


👤 krapp
The thing is, you have to know how to code well enough to know how useful the "code" ChatGPT generates is. And I put code in quotes because it's important to realize that AI doesn't know what "code" is, it just knows what syntactically likely strings are. So it might make up libraries that don't exist or write functions that don't work or don't even follow the semantics of the language, it's just spitting out something that "looks like code."

Much of the time, for simple enough tasks with enough examples in the training data, what it produces might be useful. But it can also be wrong in subtle, non-obvious ways. And if you don't know anything about coding and just don't want to pay people for their time and skill, nothing is obvious.


👤 enceladus06
What LLMs help with the most is getting the first half of a project or script outlined. Then you go in and read it to make sure ChatGPT didn't spit out something crazy. It is super helpful for writing short scripts and starting projects, but it still does hallucinate.

👤 borisk
Give it a try :)

I think it's early days, and while AI can write code to deliver features, it's not perfect. So someone who knows how to write code and deploy software, and has a basic understanding of software architecture and cybersecurity, will have a lot more luck than someone who has to run everything through the AI.

It's similar to how computers in chess worked. At one point (around the 90s and 00s), computers could play very good chess, but they had weak spots, so the best chess player in the world was a team of humans and computers. But eventually computers got better and no longer needed any human help. I think a similar scenario may play out with software development, and we're still not at the place where an AI can do everything without technical input from a human.


👤 reyml
Definitely. I did not know how to code, but thanks to ChatGPT/Claude I have been able to create several websites with a lot of backend and API manipulation.

I still don't know how many things work, like some functions, and I struggle a lot when the code gets complicated; the model starts to lose the overall picture. To address this, I have to read the code line by line to understand what's happening.

So far I have mostly used Node.js with Express on the backend. My advice would be to go ahead and try it, don't be afraid, as it helps a lot. It makes you feel like you can build a lot of complicated stuff, though I'm having a hard time finding ideas lately.


👤 rsynnott
> There's a lot of talk on Twitter how AI means anyone can now easily code and build products and stuff.

There's a lot of talk about all sorts of nonsense on Twitter. No, this isn't true.


👤 nubinetwork
I was able to get an LLM chatbot to write another chatbot, with the idea of giving it filesystem access and the ability to run git. I got half of it working, but I got stumped around function-calling LLM models, so I kind of gave up.

But a better answer would be: given a large enough context size (32k or higher), it should be reasonable to have it write code, as long as it doesn't try to do something stupid like using the wrong JSON format for a function that doesn't even exist in that library.
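For context on the function-calling part: it mostly comes down to describing each tool with a JSON schema and parsing the arguments the model sends back. A minimal, hypothetical sketch of the common OpenAI-style "tools" shape (the read_file tool and its fields here are made-up examples, not a real API call):

```python
import json

# Hypothetical tool definition you'd hand to a function-calling model.
read_file_tool = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read a file from the chatbot's working directory.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Relative file path"},
            },
            "required": ["path"],
        },
    },
}

# The model replies with the arguments as a JSON string, which you
# parse and dispatch to your own code:
model_arguments = '{"path": "README.md"}'
args = json.loads(model_arguments)
print(args["path"])
# → README.md
```

If the schema and the model's reply don't line up (wrong field names, a function that doesn't exist), the dispatch step is exactly where things fall apart, which matches my experience above.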


👤 therobinhood
Absolutely, AI makes coding accessible to a broader audience. It allows anyone to start coding, but it's like driving a car without fully understanding how the engine works. You might find yourself constantly troubleshooting and revising your code. For example, building a simple website is quite straightforward with AI assistance, but when you dive into more complex, low-level programming, a deeper understanding becomes crucial to avoid major pitfalls.

👤 qarl
Yes.

Last week I wanted to interface with a Bluetooth device in Python on a Mac.

I asked ChatGPT for a solution. It told me about a Python-Objective-C bridge to the macOS Bluetooth API, then coded a stub to search for the device, connect to it, query its services, and write data to a specific service.

I've never used Bluetooth before, and I have minimal experience with Python. With ChatGPT, this entire prototype took about 10 minutes to implement.