Has anyone seen a study, or even an anecdote, on the correlation between a developer's success and whether they started coding earlier in their youth (say at 11) rather than later (say at 15 or 16)?
I'm sure you would find that most people who started programming at a young age (and are still doing it, obviously) are better than average and also enthusiastic. However, I think it is still a disservice to anyone who didn't have support as a child.
For the record, I was learning BASIC and programming games on my TI-83 calculator in middle school, so I would probably benefit from this question!
That is the main advantage in my case over many others who didn't start so early: programming concepts are easier to understand, I can learn codebases quicker, and so on. But I can attest to being a very, very bad programmer for about 15 years before I started taking coding seriously. You shouldn't have hired me over a fresh grad when I was 22. Now, maybe I bring more to the table than some other engineers with only a formal education. But then again, the advantage is so small that things like personality have a bigger influence on coding ability, I think.
The question about when a candidate started coding can be a nice conversation starter. But starting early doesn't guarantee competency. In fact, a kid will almost definitely not follow great SWE practices for many years. So those years of experience aren't worth a lot, in my opinion.
As others have said, the age at which someone starts programming might be a better predictor of their wealth than of anything else. I was the only kid in my neighbourhood in the 90s with a personal computer to code on. But what good does that do in describing me today as a professional, a software engineer, or a person?
It's irrelevant.
You might as well ask about the wealth of the candidate's parents, which is a significant factor in whether a child has access to a desktop computer on which they can do serious coding.
The signal this question provides is about the person giving the interview (and the company as a whole), and says nothing about the candidate.
That is, if someone asked me this question in an interview, I would immediately know that I want nothing to do with the company. I would probably politely end the interview on the spot, and would definitely not answer the question.
What the question is really asking is whether the candidate had access to a computer, as well as to resources on how to program it, while they were growing up.
This is extremely biased, because many people may not have had a computer to program on, or, even if they did, may not have had access to information on how to program it.
The point is, all of those details are irrelevant.
That is, if two candidates have the same number of years of professional experience, it doesn't matter when they first started getting interested in the field they are interviewing for.
If two candidates are equally competent at the coding/system-design questions, then I don't see what difference it makes if one candidate started at 11 and the other at 16. You'd also want to know how much time they spent programming - 1 hour per month? 1 hour per day? Did they stop for a few years? And what was the actual programming? Entering BASIC games from a book, as I did in the 1980s, and hacking Spice Lisp at CMU (as jwz did at age 16) are both "programming", but I know which of us you should have hired as an employee, and it wasn't me.
[1] I'll quote https://dl.acm.org/doi/pdf/10.1145/3196839.3196867 as the first paper I found that discusses the underlying issue, with supporting evidence:
> Computing education has experienced a falling popularity over the last decades, in particular among girls [15, 18, 29]. The situation does not seem to be improving, either in Norway or in other Western countries. We even see a drop in the number of women in computing studies in the humanities that used to attract more women [30].
> In contrast to the Western picture, certain Asian and Eastern European countries have not followed this trend and can boast a higher proportion of women in computer science [8, 50]. This leads us to conclude that there is no obvious, natural or biological reason for women being a minority in computing or for computing being associated with masculinity.
> Such associations are rather results of social structures and specific cultural constructions that have made it less likely for women to choose computing across most of the Western world [8, 52].
> Cultural images and stereotypes have been documented as being vital in creating barriers for girls’ and women's engagement in IT [5, 10], also in Norway [18].
Quite honestly, I think this is an incredibly bad measure to try to correlate with job performance. I know plenty of people who coded from a young age and aren't good at coding, or who are great at coding but wouldn't make good employees.
Learning to code early did not make me a better anything. I'm sure an adult could learn my entire childhood's worth of tinkering in 6 months to a year. And that is me being quite flattering to my younger self.
Long-term professional experience on large (50k-100k LOC and up) projects is where you cut your teeth. That knowledge is hard to learn, hard to teach, and hard to transform into constraints when architecting something. No ten-year-old has those skills, and learning to code young doesn't somehow predispose you to acquiring them any better. I'm fairly sure I don't have them yet.
Some people are better at some of these aspects than others, and vice versa. Only experience requires having programmed a lot. However, perhaps those who start young are more likely to "get it", which makes programming fun for them. Still, I think it's a poor proxy, since you can't control for who had exposure at an early age.
I started programming at age 12, quite by chance. I stumbled over some code one day, and my buddy could tell me it looked like BASIC and that I could try QBasic on my machine. I was quite fascinated and really wanted to learn more. However, I didn't know anyone who could program, and we didn't have internet or BBS access. So until I was 16, when we got our first modem, I spent a lot of time bumbling around alone.
On the bright side, I think it made me good at reading documentation. On the other hand, I could probably have done a lot more actual programming if I hadn't had to spend a full week trying to find the one keyword I was looking for... Of course, it didn't help that QBasic's built-in help was in English, and I had only started learning English that same year.
This is a bad question.
But then I was set back at the age of 18, when, sadly, I developed severe PTSD from an incident of family abuse. That was way back in 2001.
My first exposure to programming was in college, and it was incredibly discouraging to discover that many of my peers had been programming for years; I felt like it was hard to 'compete' with them simply because my parents didn't encourage me to work with computers when I was a child. Ironically, I ended up doing well on exams and getting a minor teaching position because I didn't have self-taught bad habits and could explain the coursework better than those who already 'got' the concepts from prior experience.
Does anybody care to explain their reasoning here about the correlation with wealth? Setting aside any correlation between having started early and being a good prospective hire.
If starting early actually did help them become better programmers, then it will be apparent from their current skill level and the accomplishments on their resume. Just focus on those.
People from more rural or economically poorer backgrounds would be discriminated against just because they did not have access to computers early in life.
Having earlier access to a computer and 'coding earlier' is no indicator of superior talent or current coding ability.
Just because you wrote a Ruby script at 19 doesn't mean you actually started programming then.
I had never used a computer in my life (except for a brief interaction with a TANDY) until I was 18, and I still went on to have a relatively good career as a Linux sysadmin.
It's a relatively small data point, but I do think it's milkable. For example, for product managers without experience, finding out whether they have actively thought about why things are built the way they are, or tend to be unnaturally frustrated by bad design they encounter in life, is a relatively solid indicator of whether they will be successful.
It's just one indicator, though; you will still get false positives and false negatives.
You're filtering for wealth.
The hobbyist who coded from age 10 because they just enjoy solving super complex problems will be helpful for fixing that thorny bug but may not be the right person to work with a Product Manager to break down an ambiguous project and launch something fast.
A team will necessarily contain lots of different kinds of people, and I think managers should be looking to understand what a candidate's strengths (and red flags) will be, and ask themselves whether this person could meet the needs of the team.
I like to think I've enjoyed some success over the past 11 years, but I would definitely fail your friend's arbitrary criteria.
There are of course outliers, like competitive programmers who have successful software careers, but you can find them easily by other means.
Any interview question whose answer cannot be verified fairly quickly, if used as a criterion for hiring, will get you plenty of liars and a few good people.
Also, "learning to code" is too vague. I learned to code at a very young age but did not formally begin programming until much later in life.
It's also discriminatory. As another commenter put it, you might as well ask how much money their parents earned.
I appreciate the community giving their perspectives, most of which are spot on.
The age at which you start doesn’t guarantee anything other than you can say “I’ve been writing code since X”.
So they don't even need to be geniuses; it's just that the extra time they had with programming has given them a head start, and it's an indicator that they love what they're doing, which makes it more likely that they'll be passionate about the job.
Correlation is not causation.
For what it’s worth, I learned to code in grade school.
I wasn’t terrible at my job, and left in that principal role.
Not sure how conclusive my anecdote is, but I 100% do not believe there's a strong relationship between starting to code earlier in life and being a better coder, engineer, or general employee than someone who started in college or even much later in life.