However, recently I've become very aware that JS/TS, Swift, etc. are just APIs on top of APIs. I've been drawn to learning more about how computers work, the history of programming and computers (Unix, Sinclair, Commodore, and so on, going back to Ada Lovelace, Babbage, and the mainframes of the 1950s), and topics like memory allocation. I've tried learning some BASIC and assembly but haven't really got very far. I read and devour articles on sites like https://twobithistory.org, but they only get you so far.
What can I do to help accelerate this and satiate this desire to learn more about how computers work? I live in London, UK and would be happy to spend some money on a uni course or something if there was a good one. I learn best practically so like to be "doing" something as well as theory.
https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2...
I recommended it to my daughter when she was taking a class in R and asked "But how does the COMPUTER know what to do?"
Explains how we got from Boolean logic to microchips and software.
Also, the Computer History Museum in Silicon Valley has an excellent exhibit containing both early computing devices and the seminal papers that were the precursors and enablers of modern computers: https://www.computerhistory.org/revolution/ and https://computerhistory.org/timelines/
https://kpolsson.com/comphist/
Apparently the author has been maintaining that timeline since 1995 and is still doing it!
While it doesn't cover things like computer science, I think it's an excellent jumping off point for learning about notable people and events.
Not exactly what you asked for, but you may also find it interesting, and it may give you some insight I think more programmers should have.
EDIT: Also, don't stop at Babbage & Lovelace. Although Babbage's Analytical Engine was one of the first, if not the first, programmable computers with a form of memory, there were people working on extremely primitive computers (or rather, advanced calculators) way before Babbage. Schickard, Pascal, and Leibniz conceived of and developed calculating engines that did basic math with support for interim value storage, which one might consider the earliest form of computer memory.
https://www.google.com/search?q=oral+history+computer+museum...
To find more (and there are many great ones outside of the Museum), you can try a broader search:
https://www.google.com/search?q=oral+history+arpa+filetype%3...
I have found that I can read around 3-5x faster than I can listen to people talk, depending on the speed of the speaker (most of the people interviewed in these oral histories are quite old and tend to speak a bit more slowly), and I also retain the information much better. There is something about reading an actual conversation with someone who was there when this stuff was being invented (or literally invented it themselves) that you don't get from a retrospective historical account, and it makes the information stick with you more, since it's all framed in stories and personal accounts.
Some favorites:
https://conservancy.umn.edu/bitstream/handle/11299/107503/oh...
https://archive.computerhistory.org/resources/access/text/20...
https://conservancy.umn.edu/bitstream/handle/11299/107247/oh...
https://archive.computerhistory.org/resources/text/Oral_Hist...
http://archive.computerhistory.org/resources/text/Oral_Histo...
https://conservancy.umn.edu/bitstream/handle/11299/107613/oh...
https://digitalassets.lib.berkeley.edu/roho/ucb/text/valenti...
https://conservancy.umn.edu/bitstream/handle/11299/107642/oh...
There are so many other good ones, but that's a good start!
Another good approach is to start with a simple system with well-defined rules and make a simple computer out of it. Many people do this in Minecraft; for me it was Boolean functions in Excel. You can and should look many things up during this process, and fail and rework designs several times. Learning how logic gates work, then scaling that knowledge up to bit adders, registers, an ALU, a CPU instruction set, and eventually a basic Turing-machine architecture is a very rewarding hobby and is definitely the best way to get low-level knowledge. (A tiny example of the first step is sketched below.)
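For instance, here's a minimal sketch in C (my own illustration of that first step, not something from the comment above) of a one-bit full adder built from nothing but Boolean gates, chained into a 4-bit ripple-carry adder:

    #include <stdio.h>

    /* A one-bit full adder built purely from Boolean gates:
       sum = a XOR b XOR carry-in; carry-out = majority(a, b, carry-in). */
    static void full_adder(int a, int b, int cin, int *sum, int *cout) {
        *sum  = a ^ b ^ cin;
        *cout = (a & b) | (a & cin) | (b & cin);
    }

    /* Chain four full adders into a 4-bit ripple-carry adder. */
    static int add4(int x, int y) {
        int sum = 0, carry = 0;
        for (int i = 0; i < 4; i++) {
            int s;
            full_adder((x >> i) & 1, (y >> i) & 1, carry, &s, &carry);
            sum |= s << i;
        }
        return sum; /* final carry is discarded, as in real 4-bit hardware */
    }

    int main(void) {
        printf("%d\n", add4(5, 9)); /* prints 14 */
        return 0;
    }

The same XOR/majority structure is exactly what you'd wire up in Minecraft redstone or Excel Boolean formulas; registers and an ALU are "just" more of the same pieces.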
EDIT: Per Amazon there's a second edition of Code coming out at some point, but no date that I've been able to find.
I've also got a copy of, but have not yet read, Ideas That Created the Future: Classic Papers of Computer Science, edited by Harry R. Lewis. Its 46 papers on computing are arranged chronologically, with the most recent from 1979; being largely historical, it ought to be a decent starting point as well.
https://www.tracykidder.com/the-soul-of-a-new-machine.html
Another comment mentioned “Pirates of Silicon Valley” as a good dramatization of MS/Apple and there’s also the miniseries “Valley of the Boom” about the rise and fall of Netscape and “Halt and Catch Fire” which is a fictional and thematic view of 80s/90s computer history.
You might be familiar with Feynman's Lectures on Physics, but his Lectures on Computation (based on a class he taught and his work on the Connection Machine) are no less amazing. Through this short book, Feynman guides us through the concept of computation and the von Neumann architecture in his unique style, from logic functions to Turing machines, coding, and even quantum computers. It will give you a unique appreciation of the finer points in which computers are "dumb as hell but go like mad", so that you can better squeeze every bit of performance out of your code.
History-wise, enjoy learning slowly because there's so much that even if you dedicated yourself to it you wouldn't be "done" any time soon! Some suggestions in order though:
Watching The Mother of All Demos: https://www.youtube.com/watch?v=yJDv-zdhzMY
A short clip of Sketchpad presented by Alan Kay: https://www.youtube.com/watch?v=495nCzxM9PI
An article from the 40s that also inspired Engelbart: https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...
The Information by James Gleick
What the Dormouse Said by John Markoff
The Psychology of Computer Programming by Gerald Weinberg
Lastly, to mix up in whatever order you please, some paper collections:
Object-Oriented Programming: The CLOS Perspective edited by Andreas Paepcke
History of Programming Languages papers for various langs you're interested in, here's the set from the second conference in 1993 https://dl.acm.org/doi/proceedings/10.1145/154766 but there have been further conferences to check out too if it's interesting
Also all of the Turing Award winners' lectures I've read have been good https://amturing.acm.org/lectures.cfm
All that and some good recommendations others have given should keep you busy for a while!
In terms of "how computers work" I agree with others who recommended Elements of Computing Systems (aka nand2tetris).
I highly recommend reading "The Dream Machine" by Mitchell Waldrop. It's very well written, and covers a huge swath of computing history, from the ENIAC to the Internet (it was written in 2000).
Instead of recommending specific sources (too many), I can mention key milestones in computing history that you may want to research:
- Theory of computation (Alan Turing, Alonzo Church)
- Early binary systems (John Atanasoff, Konrad Zuse, George Stibitz, Claude Shannon)
- Early computers (ABC, ENIAC, EDSAC, EDVAC, Von Neumann architecture)
- Early programming (Assembly language, David Wheeler, Nathaniel Rochester)
- Early interactive computing (MIT Whirlwind, SAGE, TX-0, TX-2)
- Early mainframes (UNIVAC, IBM 70x series)
- Early programming languages (Speedcoding, Autocode, A-0, A-2, MATH-MATIC, FLOW-MATIC)
- First programming languages (FORTRAN, COBOL, LISP, ALGOL)
- Early operating systems (GM-NAA I/O, BESYS, SOS, IBSYS, FMS)
- Early time-sharing systems (MIT CTSS, Multics, DTSS, Berkeley TSS, IBM CP-67)
- Early Virtual Memory (Atlas, Burroughs MCP)
- Early minicomputers (DEC PDP line)
- Mainframe operating systems (IBM OS/360, UNIVAC EXEC)
- Early online transaction processing (SABRE, IBM ACP/TPF)
- Early work on concurrency (Edsger Dijkstra, C.A.R. Hoare, Per Brinch Hansen)
- Early database systems (GE IDS, IBM IMS, CODASYL)
- Early Object-Oriented Programming (Simula I, Simula 67, Smalltalk)
- More programming languages (CPL, BCPL, B, C, BASIC, PL/I)
- Mini/Supermini operating systems (Tenex, TOPS-20, VMS)
- Structured Programming (Pascal, Modula, Niklaus Wirth)
- Relational data model and SQL (Codd, Chamberlin, Boyce)
I could keep going on, but this is already too long. I hope this at least puts your feet on the first steps.
http://www.catb.org/~esr/writings/cathedral-bazaar/
and other things from esr, including the aforementioned Jargon File. Here's one I hadn't stumbled on before, 'Things Every Hacker Once Knew':
http://www.catb.org/~esr/faqs/things-every-hacker-once-knew/
For an activity, YMMV depending on how much time you can spend; an alternative to building a computer from scratch, or an OS from scratch, is to buy a vintage cheapie running CP/M or DOS, something where the OS isn't abstracting memory management for you. Growing up in the 80s, I think managing my own memory, and _everything_ that implies, was the greatest teacher.
https://en.wikipedia.org/wiki/The_Machine_That_Changed_the_W...
is out of print, but can be found intermittently on YouTube.
I love the coverage of 1940s computing, with interviews with several of the surviving people:
https://en.wikipedia.org/wiki/Konrad_Zuse
https://en.wikipedia.org/wiki/ENIAC
https://en.wikipedia.org/wiki/Eckert%E2%80%93Mauchly_Compute...
https://en.wikipedia.org/wiki/EDSAC
Currently working episode links:
1: https://www.youtube.com/watch?v=hayi9AsDXDo
2: https://www.youtube.com/watch?v=GropWVbj9wA
3: https://www.youtube.com/watch?v=rTLgAI3G_rs
https://adventofcomputing.com/
It's also fairly entertaining.
- The Story of Math(s) by Marcus du Sautoy to set the stage... school and taxes in ancient Sumeria, Fibonacci bringing Indian numbers to Europe, and other fascinating subjects.
- We watched short biographies of Babbage and Lovelace, full-length ones of Turing and Von Neumann. The "code breakers" of WWII.
- Top Secret Rosies: The Female "Computers" of WWII, another good one.
- There's more history in PBS's Crash Course Computer Science than you might expect. It is great, although so peppy we had to watch it at 0.9x with NewPipe. It goes from relays and vacuum tubes to ICs and the Raspberry Pi, as well as the logic gates they model.
- "The Professor" at Computerphile is a great story teller about the early days.
- There are great videos about CTSS being developed at MIT I think, where they are designing an operating system via paper terminal and trying to decide on how to partition the memory/storage: https://www.youtube.com/watch?v=Q07PhW5sCEk
- The Introducing Unix videos by AT&T are straight from the source: https://www.youtube.com/watch?v=tc4ROCJYbm0
- The movie/book "Hidden Figures" touches on this time as well. Facing obsolescence by IBM, one of the characters teaches herself Fortran.
- The Pirates of Silicon Valley is a fun dramatization of the late 70s to 80s PC industry. It specifically calls out the meeting between MS and IBM as the deal of the century. We also watched a "Berkeley in '68" doc on Kanopy to set the stage before this one. Interesting, but a tangent.
- The "8-bit Guy" is also great, he dissects and rebuilds old home computer hardware from the same era, and teaches their history as he does it. Even his tangential videos on why there are no more electronics stores (besides Apple) in malls is great.
- There are good docs on the "dead ends" of the industry as well, such as "General Magic" and "Silicon Cowboys."
- "Revolution OS" a doc about the beginnings of FLOSS and Linux.
"Understanding Digital Computers : A Self-learning Programmed Text That Will Teach You the Basics for the Microcomputer Revolution" by Forrest M. Mims III.
It's dated, but the core material is still relevant. Even the dated sections might suit you if you're interested in the history.
In a similar vein, a few years ago I wrote a course which starts with the idea of a bit and ends with the student programming a computer that they built themselves in a logic simulator.
https://john.daltons.info/teaching/engineering/
The first few lessons meander, as I was still figuring out a direction, so the meat starts at lesson 3. The last lessons are missing (roundtoit), but if there is interest I can put them on-line. From memory all the examples are on-line. Here is the final computer:
https://john.daltons.info/teaching/engineering/simcirjs/comp...
In this example a program is already in memory, so just push "run/stop" to make it run. The instruction set isn't on-line, as it's in the later lessons, which I haven't gotten around to uploading.
A one-hour journey takes you to The Centre for Computing History in Cambridge [1]. Please go there, see for yourself, and interact with the history of computing. You may also buy one of the maker kits to get started [2].
I too have a similar keen interest and there are some fantastic volunteering opportunities to deep dive and learn about the history. [3]
And there was an awesome Gaming Generations exhibition that just ended last week [4].
You could combine all this with the other equally fantastic suggestions proposed here (Ben Eater's videos, Nand2Tetris, etc.).
That hopefully makes for a fun, interactive way of satiating this hunger :-)
[1] http://www.computinghistory.org.uk
[2] http://www.computinghistory.org.uk/det/50229/MyZ80-Maker-Kit...
[3] http://www.computinghistory.org.uk/pages/14522/volunteering/
[4]http://www.computinghistory.org.uk/det/66270/gaming-generati...
The history of computing is replete with really dumb ideas, from addition and multiplication tables in memory (IBM 1620) to processors optimized for Ada that ran too slowly to be useful (Intel iAPX 432). There were really smart ideas, too, such as cache (IBM System/360 Model 85) and RISC (too many systems to mention). What you want is just the smart ideas, I'd say.
If you want to get an understanding of how modern computers work, and given your CS degree, I would recommend David Patterson/John Hennessy's Computer Organization and Design, any edition. A lot of universities use this book in a second-year architecture course.
In terms of relating this information to the overall hierarchy of computer systems, I would also recommend Nisan and Schocken's Elements of Computing Systems.
It tells the history of computing by following J.C.R. Licklider. As one of the directors of ARPA, he was responsible for funding research labs to work on computer research. He had a major impact on which projects got funded, and in turn on which systems are still being used 60 years later. I honestly love this book so much. If you love computers and history, it's a must-read.
1. The Annotated Turing.
2. A History of Modern Computing, 3rd ed.
3. The ACM Turing award lectures
4. Theory of computation - Dexter Kozen
5. Coders at Work
6. Hackers: Heroes of the computer revolution
Additionally, you could subscribe to Communications of the ACM, a computing-oriented monthly magazine.
These two books helped me much already:
Programming from the Ground Up by Jonathan Bartlett - A very good introduction to assembly
Learning Computer Architecture with Raspberry Pi by Eben Upton - A great read about the inner workings of memory and the CPU, with reference to the past and how things developed
- Turing’s Cathedral, by George Dyson
- Black Software, by Charlton McIlwain
- Programmed Inequality, by Mar Hicks
The Dyson book is a rigorous and deep historical dive into the philosophical and practical origins of digital computing, and is really great.
The other two are equally great and deep but cover computing history through different lenses. The Hicks book in particular may be of interest for you, as its emphasis is on the history of computing in the UK. They’re less directly about how computers “work”, as such, and more about how computers and society have interacted with one another in interesting and non-obvious ways, and how those interactions have impacted the ways in which technologies have developed.
It's also worth looking at: https://www.youtube.com/user/yoshikiohshima. There's a goldmine of talks by people like Alan Kay and Seymour Papert. An important question to ask when probing the literature: why are computers the way they are in terms of human-computer interaction and human culture? What is a "computer" without making an appeal to mathematical concepts like Turing machines or the lambda calculus? What are the major "paradigm shifts" that gave us GUIs, mice, etc.?
It's worth noting that the history of popular computers parallels almost exactly the neoliberal economic period. Atari was founded in 1972. Look into the Mansfield Amendment and ARPA. Try to get past the cultural myth that computer companies started in "normal" people's garages. Try to see past the "present concept". Alan Kay has famously said, "The computer revolution hasn't happened yet." It's up to the current and future generations to "really" define what computers are in terms of human culture. Think "living history". Think "the world before and after the invention of the Gutenberg printing press".
https://www.nsf.gov/nsb/documents/2000/nsb00215/nsb50/1970/m...
https://en.wikipedia.org/wiki/Douglas_Engelbart
https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...
* Understand Brainfuck, or the so-called RAM machine, as the simplest computer (a minimal interpreter is sketched after this list)
* read 50 pages of https://en.m.wikipedia.org/wiki/Code:_The_Hidden_Language_of_Computer_Hardware_and_Software
* read https://www.bottomupcs.com/ to understand low level stuff
* Learn some C
To understand computation, I think Scheme or the lambda calculus is best, though I don't know a good intro. Bear in mind that what we have is just one implementation/abstraction of computation, and likely still a suboptimal one. That's why people keep coming up with new languages and VMs. I wonder if some alternative to the RAM machine exists; I've heard about Lisp machines…
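To make the first bullet concrete, here is a minimal Brainfuck interpreter in C (my own sketch, not from the links above). The whole "machine" is a tape of byte cells, a data pointer, and eight instructions, which is what makes it such a clean model of a bare RAM machine:

    #include <stdio.h>

    /* Minimal Brainfuck interpreter: a tape of cells, a data pointer,
       and eight instructions. Loop brackets are matched by scanning. */
    static void bf_run(const char *prog) {
        unsigned char tape[30000] = {0};
        unsigned char *p = tape;
        for (const char *ip = prog; *ip; ip++) {
            switch (*ip) {
            case '>': p++;         break;
            case '<': p--;         break;
            case '+': (*p)++;      break;
            case '-': (*p)--;      break;
            case '.': putchar(*p); break;
            case ',': *p = (unsigned char)getchar(); break;
            case '[':                     /* cell is 0: skip past loop */
                if (*p == 0) {
                    int depth = 1;
                    while (depth) {
                        ip++;
                        if (*ip == '[') depth++;
                        else if (*ip == ']') depth--;
                    }
                }
                break;
            case ']':                     /* cell is nonzero: loop back */
                if (*p) {
                    int depth = 1;
                    while (depth) {
                        ip--;
                        if (*ip == ']') depth++;
                        else if (*ip == '[') depth--;
                    }
                }
                break;
            }
        }
    }

    int main(void) {
        bf_run("++++++++[>+++++++++<-]>.+."); /* prints "HI" */
        return 0;
    }

Everything a "real" ISA adds on top of this (registers, addressing modes, a stack) is convenience, not extra computational power.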
Here is an example link from a recent session on the history of the Ethernet networking standard: [ Ethernet’s Emergence from Xerox PARC: 1975-1980 ] https://www.youtube.com/watch?v=SVEcqZnGya0
- ECE 190 and ECE 290 covered basic programming, logic gates, and the basics of software processor architecture.
- ECE 391 (one of the hardest courses in the school) covered x86 assembly and operating system design. The capstone project for the course was to build a simple OS with terminal input.
- ECE 411 covered processor architecture in detail, and how a modern x86 processor is built.
There should be courses from other universities that cover the same topics. Here are some similar courses I found on MIT's OpenCourseWare platform.
- Computation Structures covers logic gates and other standard electronic constructs. https://ocw.mit.edu/courses/6-004-computation-structures-spr...
- Operating Systems Engineering covers fundamentals of operating system design: https://ocw.mit.edu/courses/6-828-operating-system-engineeri...
Best of luck!
I just finished implementing a really basic network stack for my x86 kernel, including a (crappy) driver for the RTL 8139 network card. I just learned a ton about how the internet works. I learned it all in college, but there’s something different about grappling with it directly.
And I’ve gotten pretty good at C in the meantime. I’ve also learned a ton about virtual memory, page tables, system calls, various hardware, how data is stored on disk, the ELF file format, how processes are loaded and executed, the x86 calling convention, a little bit of assembly code, just to name a few.
Check out https://wiki.osdev.org for where to start. I’m hoping to start writing some blog posts about all of this soon, to provide a resource to complement the osdev wiki. A lot of info on this is surprisingly hard to dig up.
You might want to look into how the idea of computation came out of mathematical work in the early twentieth century. The Annotated Turing by Charles Petzold is good if you're up for some maths.
Aerospace and spaceflight were some of the first activities that required large-scale software development. You could check out Sunburst and Luminary by Don Eyles and Digital Apollo by David Mindell.
[1] The author's father was Freeman Dyson, who was at the Institute for Advanced Study (at Princeton) with Einstein, Gödel, and others.
Website: https://adventofcomputing.com/ RSS: https://adventofcomputing.libsyn.com/rss
If you want an idea of how computers work, there are toy virtual machines that are a good teaching tool (https://peterhigginson.co.uk/RISC/).
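The heart of any such toy VM is just a fetch-decode-execute loop. Here's a hypothetical miniature one in C (an illustration of the idea only, not the instruction set the linked simulator actually uses):

    #include <stdio.h>

    /* A toy VM: four registers and a fetch-decode-execute loop.
       Each instruction is three ints: opcode, operand1, operand2. */
    enum { HALT, LOADI, ADD, PRINT };  /* LOADI r,n | ADD r1,r2 | PRINT r */

    static void vm_run(const int *code) {
        int reg[4] = {0};
        int pc = 0;                           /* program counter */
        for (;;) {
            int op = code[pc], a = code[pc + 1], b = code[pc + 2];
            pc += 3;                          /* fetch */
            switch (op) {                     /* decode and execute */
            case LOADI: reg[a] = b;             break;
            case ADD:   reg[a] += reg[b];       break;
            case PRINT: printf("%d\n", reg[a]); break;
            case HALT:  return;
            }
        }
    }

    int main(void) {
        int program[] = {
            LOADI, 0, 2,   /* r0 = 2   */
            LOADI, 1, 40,  /* r1 = 40  */
            ADD,   0, 1,   /* r0 += r1 */
            PRINT, 0, 0,   /* prints 42 */
            HALT,  0, 0
        };
        vm_run(program);
        return 0;
    }

Once this loop clicks, real CPU datasheets stop looking like magic; they describe the same loop cast in silicon.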
My very first introduction to anything "old" tech was through the TechStuff podcast[0] (re: 2011-era episodes, so sort by oldest).
More recently the On The Metal podcast[1] has been a really cool deep dive into old tech history, especially the episode (season 2) with John Graham-Cumming.
As for implementations, my first real playing around with assembly was "Learn TI-83 Plus Assembly in 28 Days"[2].
[0]: https://player.fm/series/techstuff-2152808
From September 2021:
A New History of Modern Computing
https://mitpress.mit.edu/books/new-history-modern-computing
Skip around its various chapters; it's full of little details.
Also a fun read is this old article about the silicon in Silicon Valley; it's from a long-dead magazine and is titled "They Would Be Gods":
https://www.dropbox.com/s/l9mi2aqnyf5fp3l/They%20Would%20Be%...
Lastly, the part I enjoyed most of Walter Isaacson's bio of Jobs was the adjacent history.
It covers the rise of the PC up until the early 90s and has interviews with everybody, including Bill Gates, Steve Jobs, Larry Ellison, etc. It's pretty amazing.
2. Check out a recent computing history book like: Thomas Haigh and Paul E. Ceruzzi (2021) A New History of Modern Computing, Cambridge, MA, USA: MIT Press.
Classes to look into. An intro course in microcontrollers would be a good place to start. Usually you will find them attached to the Electrical Engineering department. Maybe take a course in Circuits or Computer Architecture.
[0] - www.nandgame.com
https://en.wikipedia.org/wiki/Computer_Lib/Dream_Machines
Fascinating show, mostly about micros but featuring early industry legend Gary Kildall.
The software reviews are hilarious. The predictions of the future of computing are always wrong. The guests demoing stuff are always cut off as soon as it gets interesting.
I started programming as a teenager in that era but never saw the show back then. For me it's eye-opening just how amateur the industry really was. The show is unintentionally funny now, but it really gives a great idea of the time period.
https://www.goodreads.com/book/show/191355.Darwin_Among_The_...
- Innovators by Walter Isaacson
- Code by Charles Petzold
- The Annotated Turing by Charles Petzold
- Where The Wizards Stay Up Late by Katie Hafner
- The Information by James Gleick
- Programming Languages (and then Compilers, my favorite)
- Algorithms
- Operating Systems
At a decent school with some level of difficulty, you'll learn the big picture while doing fun projects for homework, along with history.
Programming is a craft, not a science, but it overlaps with math in a lot of places.
This podcast is everything
I'd call them abstractions. This is how it is: high-level programming languages (C#, say) are just abstractions so that we don't have to remember machine instructions. It's similar to a name in Contacts connecting to a phone number; the former is far easier to remember.
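As a tiny illustration of that layering (my own example): the C function below is the name the programmer remembers, and the comment shows roughly what a typical x86-64 compiler lowers it to, which is itself only a mnemonic layer over raw bytes.

    #include <stdio.h>

    /* High-level abstraction: a name standing in for an operation. */
    int add(int a, int b) { return a + b; }

    /* A typical x86-64 compiler lowers `add` to roughly:
           lea eax, [rdi + rsi]   ; sum the two argument registers
           ret                    ; return to the caller
       and that assembly is itself an abstraction over the raw bytes
       8D 04 37 C3 that the CPU actually fetches and decodes. */

    int main(void) {
        printf("%d\n", add(2, 40)); /* prints 42 */
        return 0;
    }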
I used NATO's conferences in 1968 and 1969 on "The Software Problem" as my inspiration.
Now that the ACM digital library is available without subscription, that would be a good resource of their publications.
Or try to find a retro computer, e.g. a BBC Micro, and start programming it for fun?
I enjoyed this book because every chapter includes hands-on hardware and software experiments for you to see the concept described in action.
Computer: A History of the Information Machine by Martin Campbell-Kelly, William Aspray
But this is first of all a history book. Nonetheless, I learned a lot of things!
About two years ago there was a similar thread here on HN:
* https://news.ycombinator.com/item?id=22907211
awesome-computer-history:
* https://github.com/watson/awesome-computer-history
- Dealers of Lightning is a wonderful book covering Xerox PARC's history and contributions. If you don't know what Xerox PARC is, then you should definitely read it.
- Where Wizards Stay Up Late is a highly-readable, engaging book covering the history of the Internet's early development.
- Soul of a New Machine gives a compelling glimpse into the era when physical machines and unique architectures were more dominant than software in shaping the market.
- The Jargon File as maintained by Eric Raymond is not without controversy, but I think it's still fair to say a lot of computing folklore and cultural history is preserved there. http://www.catb.org/jargon/html/index.html
- Folklore.org is a wonderful, wistful collection of stories from the early days of Apple Computer, as told by some of the engineers and programmers who made it what it was in the 80s and early 90s.
- The Thrilling Adventures of Lovelace And Babbage is a wonderful graphic novel that's full of utterly ridiculous fiction that's only loosely inspired by the title characters. However, it is jam-packed with footnotes about the actual history from top to bottom, and in my opinion, there probably isn't a better or more fascinating glimpse of the proto-history of the computer anywhere.
- Douglas Engelbart's Mother Of All Demos is well worth watching (can be found on YouTube), and maybe reading some commentary on. Mind-blowing what his team put together that we still haven't really matched in some ways.
- Vannevar Bush's piece "As We May Think" isn't really about computers, but it's hard not to connect it to them when you read it. And then maybe to sigh and wonder how someone who didn't have any machine like what he describes can have a vision more compelling than what we've actually managed to build, so many decades before it happened.
- If you're interested in hypertext, look into Ted Nelson. None of his work ever really took off, and Project Xanadu was a legendary mishandled undertaking, but his vision for what might have been is fascinating, and influenced many of the software pioneers, as I understand it.
- This glorious video of using a 1930s teletype as a command-line Linux terminal taught me a surprising amount about why the classic Unix tools work as they do. https://www.youtube.com/watch?v=2XLZ4Z8LpEE
Enjoy!
https://www.amazon.com/Colossus-secrets-Bletchley-code-break...
It was used by the British to break the Lorenz cipher (which the Nazis used to encrypt high-level strategic communications.)
Learn about the CDC 6600 and how its architecture compared with IBM's top System/360 computers. Learn about the Cray-1.
Cray built the fastest computers in the world for a long time, and always sold early machines of each model to scientific government labs and the NSA. The first 6600s were delivered to Livermore and Los Alamos (see Wikipedia's CDC 6600 article).
With fewer than 30 people, Cray built a computer (the 6600) about two times faster than anything IBM with its massive budget could build. There is a famous letter by IBM chief Thomas Watson Jr. about this fact on The Computer History Museum's website.
When IBM used ICs for the 360s, Cray still used transistors for the CDC 6600. His reasoning is great.