If you think so, why?
How should it be taught?
Below are a few quotes taken from a 2018 interview with Wirth on the topic:
> "I thought, as I did in the Pascal times, the foundations have been laid when a person learns his first programming language. That’s the key.
In order to do a good experience, you have to have a clean language with a good structure concentrating on the essential concepts and not being snowed in. That is my primary objection to the commercial languages – they’re simply too huge, and they’re so huge that nobody can understand them in their entirety. And it’s not necessary that they’re so huge."
On further development of Oberon:
"The incentive came from colleagues mentioning that they had learned very, very much from my book called Project Oberon, which is one of the very few describe a full system and its development in detail. That’s something in computer science that is wrong. People learn to write before they have learned reading."
Btw, programming should be taught on a model more like apprenticeship, less like classroom/college (or even bootcamp): low wages while skill is low, but guaranteed opportunities to upskill over time, with the process monitored by outside experts (i.e., a guild) to make sure the apprentice is treated fairly.
When I want to learn or teach, I'm most motivated by a specific project with outcomes that interest me. I learn the programming aspect to get what I want, not to learn programming. This approach is used with kids learning programming through robotics or Scratch games, but I think it works for adult education, too.
I learned Lua after being a hardcore JS person for many years because I wanted to mod a Steam game that used it. I joined the religion of TypeScript from coding handheld games in it, not because I had to. Recently, I've been teaching a new programmer who doesn't work in our field (she's an illustrator). Her motivation has been to have a truly custom online portfolio, which led us to start her programming journey with web components and TS.
Seeing results you care about is the best way to learn programming, I think.
Instead, think of learning as something you do for yourself. You will probably need other people to help: teachers, mentors, etc. You need motivation, interest, curiosity, and lots of deliberate practice.
However, I just wish this industry were more willing to engage in on-the-job training. I think this could work for people with and without programming experience.
Though perhaps those without experience would take too much time to learn and/or might end up uninterested, making them a huge cost/time sink for a company despite seeming like a good investment at the beginning. I will say, it's hard to be a good programmer without some sort of passion for the craft. The best programmers I have known have usually been people who genuinely enjoyed programming. I thoroughly enjoy it myself, but I am so burned out that my passion is almost gone, and I seriously think it is starting to hinder my advancement.
I am just annoyed that so many places turn down talented people, not for their inability to learn, but for the sheer fact of not already knowing.

As for programming education, though, I do not think there are any issues in programming that cannot be extrapolated to education as a whole. Programming is just using a keyboard to type memorized syntax into the magic electrical box. Instead of focusing on programming itself, I think we'd be better served focusing on what programming relies on: logic, critical thinking, deductive reasoning, troubleshooting skills, and perhaps a few others. Our current system, however, tries to teach my previous (and imperfect) list of required skills while simultaneously teaching programming languages and their respective syntax. I bet you could educate some damn fine "programmers" without them ever touching a computer.
There are really two skills at play here: telling a computer to do a thing, and figuring out what thing the computer should be told to do. Sure, plenty of people struggle a lot with the first skill. But it's almost entirely pointless to teach the first skill to someone who lacks the second.
It's as if you taught a pet to indicate letters in sequence from among a pile of cards, found that it could spell no meaningful sentences, yet declared "ah, I have been successful, and to compound my success I will teach it to do this with the Greek alphabet too!"
Thus we get legions of programmers who can tell a computer to model a chessboard with `class WhiteRook extends Piece`, but fail to understand that this is a bad solution and bitboards are better. Or substitute your favorite example.
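For anyone unfamiliar with the contrast, here's a minimal sketch of the bitboard idea (the names and starting squares are just illustrative): instead of an object per piece, each piece type is a 64-bit integer with one bit per square, and board queries become bitwise operations.

```typescript
// Illustrative bitboard sketch: one 64-bit mask per piece type,
// one bit per square, instead of a class per piece.
type Bitboard = bigint;

// Map (file, rank) to a single set bit: a1 = bit 0, h8 = bit 63.
const squareBit = (file: number, rank: number): Bitboard =>
  1n << BigInt(rank * 8 + file);

// White rooks on a1 and h1 (the standard starting squares).
let whiteRooks: Bitboard = squareBit(0, 0) | squareBit(7, 0);

// Occupancy checks, move generation, and attack maps all reduce
// to cheap bitwise operations on these masks.
const occupies = (board: Bitboard, file: number, rank: number): boolean =>
  (board & squareBit(file, rank)) !== 0n;

console.log(occupies(whiteRooks, 0, 0)); // true: a rook sits on a1
console.log(occupies(whiteRooks, 4, 0)); // false: e1 is empty
```

The point isn't the syntax; it's that choosing the representation is a judgment call (the second skill), and no amount of fluency in `class` declarations gets you there.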
This point comes up when we discuss the low relevance of specific language skills to hiring "senior developers" (this being how we refer to people who have gained the second skill). The usual story told to explain this phenomenon is that senior developers have mastered the first skill to such an extent that they can pick up any language quickly. There's an element of truth to that, but I think it is more accurate to say that the first skill is simply not very useful in the day to day job of a senior developer. Someone who has the second skill but lacks the first is simply better at the job than vice versa.
On the other end of the spectrum, I encounter this phenomenon in a forum for helping beginner programmers: time and again a beginner will ask "how do I frobulate the encabulator? I tried `encabulator.frobulate()`" - but when queried, they cannot explain in any terms why they are attempting such a thing. Upon further inspection it turns out that encabulators should not in fact be frobulated under any circumstances. But because the beginner only had the skill of figuring out how to do a thing, it simply never occurred to them to ask whether it should be done!

Again, it's not merely that they were wrong about it; rather, lacking the skill to make such judgments, the question itself is missing entirely from the repertoire of tools they have at hand for doing their work. Their problem-solving process is "1. Read the problem 3. Tell a computer to output the answer" - but this is missing the most critical, maybe the only critical, step: "2. Find a solution". That step is instead treated as a minor interior part of step 3, which is a woefully ineffective process.
I'll throw out one more concrete example, although I know I'll lose some readers who will insist this is actually a reasonable thing to do. It is a perennial question: "how do I get $0.20 + $0.10 not to output $0.300000004?" (or whatever it is). The correct answer is "don't use floating point math for currency", but the beginner gets so focused on that '.' in the input, and on the question "how do I make that '.' happen in my programming language?", that they fail to ask the better questions: "what does that '.' mean?", "what is a floating point number?", "are they the same thing?", "what am I actually trying to accomplish?"
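For anyone who hasn't run into this, here's a quick sketch of what's actually happening, along with the usual integer-cents fix (variable names are just illustrative):

```typescript
// Binary floating point cannot represent 0.1 or 0.2 exactly,
// so the rounding error shows up in the sum.
console.log(0.1 + 0.2); // 0.30000000000000004

// The usual fix: keep money as an integer count of the smallest
// unit (cents) and only format as dollars at the edges.
const totalCents = 20 + 10; // $0.20 + $0.10
console.log(`$${(totalCents / 100).toFixed(2)}`); // "$0.30"
```

Notice that the fix doesn't answer the question the beginner asked; it answers the question they should have asked, which is exactly the second skill at work.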