In which you, more or less, build a computer from scratch. The course takes you through 12 projects, about 1 week each, where you incrementally build:
a CPU
a RAM chip
a full von Neumann computer
an assembly language
a virtual machine
a high-level language
an operating system
... using NAND gates. All of this is done on your computer using tools provided by the course. Once you've done these projects, you will understand the building blocks of a computer, from the RAM and CPU, through assembly, up to the compiler that translates your programming language of choice. It's a powerful course that will unlock a whole new perspective on computer programming for you. I believe that, bang for buck, it's probably the best online course for a self-taught programmer. It's practical, fun and mostly oriented around building things.
(from my blog @ https://mattsegal.dev/nand-to-tetris.html)
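For a tiny taste of the first project, here is a sketch in plain Python (the course itself has you wire these up in its own simple HDL and hardware simulator, so the logic is the point here, not the language): every other basic gate can be built out of NAND alone.

    def nand(a, b):
        # The one primitive gate the course gives you.
        return not (a and b)

    def not_(a):
        return nand(a, a)

    def and_(a, b):
        return not_(nand(a, b))

    def or_(a, b):
        # De Morgan: a OR b == NOT(NOT a AND NOT b)
        return nand(not_(a), not_(b))

    def xor(a, b):
        return and_(or_(a, b), nand(a, b))

    # Sanity check over the full truth table.
    for a in (False, True):
        for b in (False, True):
            assert and_(a, b) == (a and b)
            assert or_(a, b) == (a or b)
            assert xor(a, b) == (a != b)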
It is not too math heavy, and the focus is on basic, interpretable approaches and concepts like:
- linear and polynomial regression
- logistic regression and linear discriminant analysis
- cross-validation and the bootstrap
- model selection and regularization methods (ridge and lasso; a quick code sketch follows below)
- tree-based methods, random forests and boosting
- support-vector machines
- neural networks and deep learning
- survival models; multiple testing
- some unsupervised learning methods like principal components and clustering (k-means and hierarchical).
The instructors are really articulate and passionate about teaching well. As a bonus, there are guest speakers about every second week, including Jerome Friedman and Geoff Hinton.
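To make the ridge and cross-validation bullets concrete, here is a minimal, illustrative sketch using scikit-learn (my own example, not taken from the course's labs): pick the ridge penalty by 5-fold cross-validation on synthetic data.

    # Choose the ridge penalty (alpha) by 5-fold cross-validation.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)

    best_alpha, best_score = None, -np.inf
    for alpha in (0.01, 0.1, 1.0, 10.0, 100.0):
        score = cross_val_score(Ridge(alpha=alpha), X, y, cv=5).mean()  # mean CV R^2
        if score > best_score:
            best_alpha, best_score = alpha, score

    print(f"best alpha: {best_alpha}, cross-validated R^2: {best_score:.3f}")

Swapping Ridge for Lasso in the sketch is a one-line change.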
https://www.coursera.org/learn/basic-modeling
https://www.coursera.org/specializations/algorithms
https://www.coursera.org/learn/discrete-optimization
https://www.coursera.org/learn/programming-languages
I'm still working my way through this one, but it's been good so far:
https://www.coursera.org/specializations/data-structures-alg...
Discrete Optimization https://www.coursera.org/learn/discrete-optimization
You can get a lot from the David Tong lectures if you want something similar but free, but to me, Szekeres wins out on delivery.
* CS224n Natural Language Processing
* CS330 Meta-Learning
* CS224U Natural Language Understanding
* CS234 Reinforcement Learning
* CS221 Artificial Intelligence
* CS229 Machine Learning (the theoretical one)
The course notes can be found at cs(x).stanford.edu, for example:
cs229.stanford.edu
Of course, the main problem isn't that there's not enough good material. The problem is that there's too much! So a course that teaches how to pick the most important courses for oneself is sorely needed.
It has challenging exercises, and they give you access to their automatic exercise checker.
1. How to Design Programs: https://htdp.org/
2. A Data-Centric Introduction to Computing: https://dcic-world.org/
There are some evergreen ones there.
You'll find courses there that are among the best resources for learning those topics.
It's a nice course on a current language in the Smalltalk lineage, which, even more than 40 years after its introduction, still provides a largely unsurpassed programming environment and developer experience.
Stanford CS224W: https://web.stanford.edu/class/cs224w/
Zak Jost's Intro To GNNs: https://www.graphneuralnets.com/