What are the biggest misconceptions you've given up about computing?
- "Iterating over all of 32-bit space will take forever"
- "Applications run on top of operating systems, i.e. their code is executed by operating systems"
- "Text is a good API"
Not me, but one of my students thought that function results were magically cached by default, so if you call f(10) and call it again later, the computer remembers the value and the second call does not affect the run time. That assumption changes the complexity of the algorithm if f(10) is inside a loop and 10 is actually a variable.
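
A minimal sketch of that point (in Go, with a recursive Fibonacci as a stand-in for any expensive f): nothing remembers a function's result unless the programmer adds the cache explicitly.

```go
package main

import "fmt"

// slowFib recomputes the whole call tree on every call: nothing is cached by default.
func slowFib(n int) int {
	if n < 2 {
		return n
	}
	return slowFib(n-1) + slowFib(n-2)
}

// memoFib avoids the recomputation only because we pass in and fill a cache ourselves.
func memoFib(n int, cache map[int]int) int {
	if n < 2 {
		return n
	}
	if v, ok := cache[n]; ok {
		return v
	}
	v := memoFib(n-1, cache) + memoFib(n-2, cache)
	cache[n] = v
	return v
}

func main() {
	// Each slowFib(35) call in the loop repeats the full exponential computation.
	for i := 0; i < 3; i++ {
		fmt.Println(slowFib(35))
	}
	// With an explicit cache, the second and third calls are cheap map lookups.
	cache := map[int]int{}
	for i := 0; i < 3; i++ {
		fmt.Println(memoFib(35, cache))
	}
}
```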
That if you do the same thing you'll get the same results.
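
One hypothetical illustration of that: a program with a data race can print a different result on every run, even with identical input (Go shown only as an example).

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	counter := 0
	var wg sync.WaitGroup
	for i := 0; i < 1000; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			counter++ // data race: increments from different goroutines get lost
		}()
	}
	wg.Wait()
	// The total is frequently less than 1000 and differs from run to run.
	fmt.Println(counter)
}
```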
There are no tiny cute little fairy creatures carrying out calculations on little abacuses. In fact, it all really looks a lot like plumbing.
The Singularity will never arrive
for example: "Lisp is an AI language"