HACKER Q&A
📣 Tmkly

Best way to learn about computing history?


I'm a software engineer, mainly working on mobile apps (iOS primarily) through React Native and some Swift/Java. I have a CS degree and about 7 years in this field.

However, recently I've become very aware that JS/TS, Swift, etc. are just APIs on top of APIs. I've been drawn to learning more about how computers work, the history of programming/computers (Unix, Sinclair, Commodore, etc., even going back to Ada Lovelace, Babbage, and the mainframes of the 1950s) and things like memory allocation. I've tried learning some BASIC and assembly but haven't really got very far. I read/devour articles on sites like https://twobithistory.org but they only get you so far.

What can I do to help accelerate this and satiate this desire to learn more about how computers work? I live in London, UK and would be happy to spend some money on a uni course or something if there was a good one. I learn best practically so like to be "doing" something as well as theory.


  👤 robotguy Accepted Answer ✓
Ben Eater's YouTube series "Building an 8-bit Breadboard Computer" is a really good introduction to the lowest levels of how a computer works:

https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2...

I recommended it to my daughter when she was taking a class in R and asked "But how does the COMPUTER know what to do?"


👤 SkyMarshal
Code, by Charles Petzold: http://www.charlespetzold.com/code/

Explains how we got from Boolean logic to microchips and software.

Also, the Computer History Museum in Silicon Valley has an excellent exhibit containing both early computing devices and the seminal papers that were the precursors and enablers of modern computers: https://www.computerhistory.org/revolution/; https://computerhistory.org/timelines/;


👤 Phithagoras
From Nand to Tetris was excellent: informative and clear.

https://www.nand2tetris.org/


👤 ravenstine
Though it doesn't cover all of computing history, this site is a comprehensive timeline of personal computing history from 1947 to now.

https://kpolsson.com/comphist/

Apparently the author has been maintaining that timeline since 1995 and is still doing it!

While it doesn't cover things like computer science, I think it's an excellent jumping off point for learning about notable people and events.

Not exactly what you asked for, but you may find it interesting, and I think it gives the kind of insight more programmers should have.

EDIT: Also, don't stop at Babbage & Lovelace. Although Babbage's Analytical Engine was one of the first, if not the first, programmable computers with a form of memory, people were working on extremely primitive computers (or rather, advanced calculators) well before Babbage. Schickard, Pascal, and Leibniz conceived of and developed calculating engines that did basic math with support for interim value storage, which one might consider the earliest form of computer memory.


👤 themadturk
Steven Levy's Hackers is a foundational work in the history of computing. Levy spends a lot of time on the MIT hackers of the 1960s and 1970s, the group that hatched Lisp, Richard Stallman, and the free software movement, and also a lot of time on the Bay Area hackers who kick-started the microcomputer revolution. Certainly it's not a comprehensive guide to the full range of computing history, but it's an important and engaging look at the beginnings of where we are today.

👤 eigenvalue
I think the best way to learn this stuff is from the people who did it, speaking in their own words. But watching videos takes forever, so the best way to do this is to read oral histories. The Computer History Museum has really great content; I've read dozens of these. You can easily find them, ranked in approximate order of popularity, with the following Google search:

https://www.google.com/search?q=oral+history+computer+museum...

To find more (and there are many great ones outside of the Museum), you can try a broader search:

https://www.google.com/search?q=oral+history+arpa+filetype%3...

I have found that I can read around 3-5x faster than listening to people talk, depending on the speed of the speaker (most of the people interviewed in these oral histories are quite old and tend to speak a bit more slowly), and I also retain the information much better. There is something about reading an actual conversation with someone who was there when this stuff was being invented (or literally invented it themselves) that you don't get from a retrospective historical account, and it makes the information stick with you more, since it's all framed in stories and personal accounts.

Some favorites:

https://conservancy.umn.edu/bitstream/handle/11299/107503/oh...

https://archive.computerhistory.org/resources/access/text/20...

https://conservancy.umn.edu/bitstream/handle/11299/107247/oh...

https://archive.computerhistory.org/resources/text/Oral_Hist...

http://archive.computerhistory.org/resources/text/Oral_Histo...

https://conservancy.umn.edu/bitstream/handle/11299/107613/oh...

https://digitalassets.lib.berkeley.edu/roho/ucb/text/valenti...

https://conservancy.umn.edu/bitstream/handle/11299/107642/oh...

There are so many other good ones, but that's a good start!


👤 RugnirViking
There are also several good museums dedicated to the subject. I used to work for The National Museum of Computing at Bletchley Park in the UK, and they have a lot of good exhibits that teach the basics of how computers and networking work and how they have evolved over the years.

Another good approach is to start with a simple system with well-defined rules and make a simple computer out of it. Many people do this in Minecraft; for me it was Boolean functions in Excel. You can and should look many things up during this process, fail and rework designs several times, etc. Learning how logic gates work, then scaling that knowledge up to bit adders, registers, an ALU, a CPU instruction set, and a basic Turing-machine-style architecture is a very rewarding hobby and is definitely the best way to get low-level knowledge.
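To make that gates-to-adders climb concrete, here's a minimal Python sketch (my own illustration, not from any particular course) that builds everything from a single NAND primitive, exactly the way you would in Excel cells or Minecraft redstone:

```python
# Everything below is derived from one primitive: NAND.
def nand(a, b):
    return 1 - (a & b)

# Classic constructions: each gate built only from NAND.
def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# A full adder: two bits plus carry-in -> (sum, carry-out).
def full_adder(a, b, cin):
    s1 = xor_(a, b)
    return xor_(s1, cin), or_(and_(a, b), and_(s1, cin))

# Chain full adders into a 4-bit ripple-carry adder.
def add4(x, y):
    result, carry = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry

print(add4(0b0101, 0b0011))  # 5 + 3 -> (8, 0)
```

The same layering (NAND → gates → adder → ALU) is what the breadboard and Nand2Tetris routes walk you through physically.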


👤 Jtsummers
So your title and comment suggest two slightly different things. For "how computers work" I recommend Code by Petzold (higher level, good book) and The Elements of Computing Systems by Nisan and Schocken (also available here: https://www.nand2tetris.org/). The latter is project-based and has you develop a computer starting at NAND gates and working up. A moderately experienced developer can run through it at a good clip while still learning a lot.

EDIT: Per Amazon there's a second edition of Code coming out at some point, but no date that I've been able to find.

I've also got a copy of (but have not yet read) Ideas That Created the Future: Classic Papers of Computer Science, edited by Harry R. Lewis. The contents are in chronological order, with the most recent from 1979. It has 46 different papers on computing; being largely historical, it ought to be a decent starting point as well.


👤 ecliptik
I haven't read it, but I've heard good things about "The Soul of a New Machine", about Data General's effort to create a new 32-bit superminicomputer.

https://www.tracykidder.com/the-soul-of-a-new-machine.html

Another comment mentioned “Pirates of Silicon Valley” as a good dramatization of MS/Apple and there’s also the miniseries “Valley of the Boom” about the rise and fall of Netscape and “Halt and Catch Fire” which is a fictional and thematic view of 80s/90s computer history.


👤 Anon84
Feynman’s Lectures on Computation: https://www.amazon.com/gp/product/B07FJ6RRK7/ref=as_li_tl?ie...

You might be familiar with Feynman's Lectures on Physics, but his Lectures on Computation (based on a class he taught and his work on the Connection Machine) are no less amazing. Through this short book, Feynman guides us through the concept of computation and the von Neumann architecture in his unique style, from logic functions to Turing machines, coding, and even quantum computers. It will give you a unique appreciation of the finer points of how computers are "dumb as hell but go like mad", so that you can better squeeze every bit of performance out of your code.


👤 Jach
For "how things work", I recommend the book Code by Charles Petzold. After that, Jon Stokes's Inside the Machine will give a lot of details on CPU architectures up to Intel's Core 2 Duo. You can also try following along a computer engineering book if you want to go that low in detail with exercises, Digital Fundamentals by Floyd is a common textbook (I have an old 8th edition).

History-wise, enjoy learning slowly because there's so much that even if you dedicated yourself to it you wouldn't be "done" any time soon! Some suggestions in order though:

Watching The Mother of All Demos: https://www.youtube.com/watch?v=yJDv-zdhzMY

A short clip of Sketchpad presented by Alan Kay: https://www.youtube.com/watch?v=495nCzxM9PI

An article from the 40s that also inspired Engelbart: https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...

The Information by James Gleick

What the Dormouse Said by John Markoff

The Psychology of Computer Programming by Gerald Weinberg

Lastly, to mix up in whatever order you please, some paper collections:

Object-Oriented Programming: The CLOS Perspective edited by Andreas Paepcke

History of Programming Languages papers for various langs you're interested in, here's the set from the second conference in 1993 https://dl.acm.org/doi/proceedings/10.1145/154766 but there have been further conferences to check out too if it's interesting

Also all of the Turing Award winners' lectures I've read have been good https://amturing.acm.org/lectures.cfm

All that and some good recommendations others have given should keep you busy for a while!


👤 netsharc
A long while ago I found the Jargon File, a "dictionary" of terms used by hackers as the culture was budding at the universities in the '70s. Reading the entries gives you a glimpse of the technology and culture of those places at that time. Young me found it really cool in a nerdy way and read all the entries from front to back. Since this was before the always-online times, I just read the TXT file from http://jargon-file.org/archive/ rather than navigating the many pages: http://www.catb.org/~esr/jargon/

👤 chillpenguin
In terms of computing history, The Dream Machine by Mitchell Waldrop is incredibly good.

In terms of "how computers work" I agree with others who recommended Elements of Computing Systems (aka nand2tetris).


👤 khaledh
I have the same passion for computing history. I can't count the amount of literature I've read to learn about this fascinating history; it's very satisfying to know when, how, where, and by whom original work was done to advance computing. Most of the foundational work in computer architecture and computer science was done in the '50s, '60s, and '70s. Since then it has mostly been incremental improvement.

I highly recommend reading "The Dream Machine" by Mitchell Waldrop. It's very well written, and covers a huge swath of computing history, from the ENIAC to the Internet (it was written in 2000).

Instead of recommending specific sources (too many), I can mention key milestones in computing history that you may want to research:

- Theory of computation (Alan Turing, Alonzo Church)

- Early binary systems (John Atanasoff, Konrad Zuse, George Stibitz, Claude Shannon)

- Early computers (ABC, ENIAC, EDSAC, EDVAC, Von Neumann architecture)

- Early programming (Assembly language, David Wheeler, Nathaniel Rochester)

- Early interactive computing (MIT Whirlwind, SAGE, TX-0, TX-2)

- Early mainframes (UNIVAC, IBM 70x series)

- Early programming languages (Speedcoding, Autocode, A-0, A-2, MATH-MATIC, FLOW-MATIC)

- First programming languages (FORTRAN, COBOL, LISP, ALGOL)

- Early operating systems (GM-NAA I/O, BESYS, SOS, IBSYS, FMS)

- Early time-sharing system (MIT CTSS, Multics, DTSS, Berkeley TSS, IBM CP-67)

- Early Virtual Memory (Atlas, Burroughs MCP)

- Early minicomputers (DEC PDP line)

- Mainframe operating systems (IBM OS/360, UNIVAC EXEC)

- Early online transaction processing (SABRE, IBM ACP/TPF)

- Early work on concurrency (Edsger Dijkstra, C.A.R. Hoare, Per Brinch Hansen)

- Early database systems (GE IDS, IBM IMS, CODASYL)

- Early Object-Oriented Programming (Simula I, Simula 67, Smalltalk)

- More programming languages (CPL, BCPL, B, C, BASIC, PL/I)

- Mini/Supermini operating systems (Tenex, TOPS-20, VMS)

- Structured Programming (Pascal, Modula, Niklaus Wirth)

- Relational data model and SQL (Codd, Chamberlin, Boyce)

I could keep going on, but this is already too long. I hope this at least puts your feet on the first steps.


👤 listenfaster
Good advice: split your time between activities and reading something as satisfying as the things you "devour". To that end, I would plus-one Hackers (Levy) and Code (Petzold). Also, The Cathedral and the Bazaar by esr

http://www.catb.org/~esr/writings/cathedral-bazaar/

and other things from esr at

http://www.catb.org/~esr/

including the aforementioned jargon file. Here’s one I hadn’t stumbled on before, ‘Things Every Hacker Once Knew’

http://www.catb.org/~esr/faqs/things-every-hacker-once-knew/

For an activity, YMMV depending on how much time you can spend; an alternative to building a computer from scratch, or an OS from scratch, is to buy a vintage cheapie running CP/M or DOS, something where the OS isn't abstracting memory management for you. Growing up in the 80s, I think managing my own memory and _everything_ that implies was the greatest teacher.


👤 als0
If you can manage a day trip to Cambridge (about an hour from London), you should visit the excellent Centre for Computing History: http://www.computinghistory.org.uk/

👤 bingaling
The 1992 WGBH/BBC 5-part miniseries "The Machine That Changed The World"(US)/"The Dream Machine"(UK):

https://en.wikipedia.org/wiki/The_Machine_That_Changed_the_W...

is out of print, but can be found intermittently on youtube.

I love the coverage of 1940's computing, with interviews with several of the surviving people:

https://en.wikipedia.org/wiki/Konrad_Zuse

https://en.wikipedia.org/wiki/ENIAC

https://en.wikipedia.org/wiki/Eckert%E2%80%93Mauchly_Compute...

https://en.wikipedia.org/wiki/EDSAC

Currently working episode links:

1: https://www.youtube.com/watch?v=hayi9AsDXDo

2: https://www.youtube.com/watch?v=GropWVbj9wA

3: https://www.youtube.com/watch?v=rTLgAI3G_rs

4: https://www.youtube.com/watch?v=E1zbCU5JnE0

5: https://www.youtube.com/watch?v=vuxYUJv2Jd4


👤 spogbiper
The Advent of Computing podcast may be of interest. The host really strives to find accurate historical information about a variety of early computing topics.

https://adventofcomputing.com/

It's also fairly entertaining


👤 digisign
Was just showing the subject to a youngster recently. Other folks mentioned the Code book; I liked that one. The Mythical Man-Month by Brooks, of course. We also looked at the following videos on YouTube/Kanopy and other places:

- The Story of Math(s) by Marcus du Sautoy to set the stage... school and taxes in ancient Sumeria, Fibonacci bringing Indian numbers to Europe, and other fascinating subjects.

- We watched short biographies of Babbage and Lovelace, and full-length ones of Turing and von Neumann. Also the "code breakers" of WWII.

- Top Secret Rosies: The Female "Computers" of WWII, another good one.

- There's more history in PBS's Crash Course Computer Science than you might expect. It is great, although so peppy we had to watch at 0.9x with NewPipe. It covers relays, vacuum tubes, and ICs, up to the Raspberry Pi, as well as the logic gates they model.

- "The Professor" at Computerphile is a great story teller about the early days.

- There are great videos about CTSS being developed at MIT, I think, where they are designing an operating system via paper terminal and trying to decide how to partition the memory/storage: https://www.youtube.com/watch?v=Q07PhW5sCEk

- The Introducing Unix videos by AT&T are straight from the source: https://www.youtube.com/watch?v=tc4ROCJYbm0

- The movie/book "Hidden Figures" touches on this era as well. Facing obsolescence from IBM's machines, one of the characters teaches herself Fortran.

- The Pirates of Silicon Valley is a fun dramatization of the late 70s to 80s PC industry. It specifically calls out the meeting between MS and IBM as the deal of the century. We also watched a "Berkeley in '68" doc on Kanopy to set the stage before this one. Interesting, but a tangent.

- The "8-bit Guy" is also great; he dissects and rebuilds old home computer hardware from the same era and teaches its history as he does it. Even his tangential videos, like the one on why there are no more electronics stores (besides Apple) in malls, are great.

- There are good docs on the "dead ends" of the industry as well, such as "General Magic" and "Silicon Cowboys."

- "Revolution OS" a doc about the beginnings of FLOSS and Linux.


👤 tapoxi
Computer Chronicles was a PBS series that ran for 20 years and captured a lot of computer history as it happened. It's a great watch on YouTube: https://youtube.com/user/ComputerChroniclesYT

👤 femto
Try this book:

"Understanding Digital Computers : A Self-learning Programmed Text That Will Teach You the Basics for the Microcomputer Revolution" by Forrest M. Mims III.

It's dated, but the core material is still relevant. Even the dated sections might suit you if you're interested in the history.

In a similar vein, a few years ago I wrote a course which starts with the idea of a bit and ends with the student programming a computer that they built themselves in a logic simulator.

https://john.daltons.info/teaching/engineering/

The first few lessons meander, as I was still figuring out a direction, so the meat starts at lesson 3. The last lessons are missing (roundtoit), but if there is interest I can put them on-line. From memory all the examples are on-line. Here is the final computer:

https://john.daltons.info/teaching/engineering/simcirjs/comp...

In this example a program is already in memory, so just push "run/stop" to make it run. The instruction set isn't on-line, as it's in the later lessons, which I haven't gotten around to uploading.


👤 oumua_don17
As you are based in London, UK, let me propose a slightly different alternative.

A one-hour journey takes you to The Centre for Computing History in Cambridge [1]. Please go there, see for yourself, and interact with the history of computing. You may also buy one of the maker kits to get started with [2].

I too have a similar keen interest and there are some fantastic volunteering opportunities to deep dive and learn about the history. [3]

And there was an awesome Gaming Generations exhibition that just ended last week [4].

You could combine all this with the other equally fantastic suggestions proposed here (Ben Eater's videos, Nand2Tetris, etc.).

That hopefully makes for a fun, interactive way of satiating your good hunger :-)

[1] http://www.computinghistory.org.uk

[2] http://www.computinghistory.org.uk/det/50229/MyZ80-Maker-Kit...

[3] http://www.computinghistory.org.uk/pages/14522/volunteering/

[4] http://www.computinghistory.org.uk/det/66270/gaming-generati...


👤 vincent-manis
I'm not at all sure that learning about computer history and learning about how computers work are the same thing. For example, looking at early microprocessors would give you the idea that instruction set architectures are completely random when in fact their designers were faced with a limited transistor budget and very short development times. Often, microprocessors were offered as a replacement for discrete logic, rather than as generally programmable computing devices.

The history of computing is replete with really dumb ideas, from addition and multiplication tables in memory (IBM 1620) to processors optimized for Ada that ran too slowly to be useful (Intel iAPX 432). There were really smart ideas, too, such as cache (IBM System/360 Model 85) and RISC (too many systems to mention). What you want is just the smart ideas, I'd say.

If you want to get an understanding of how modern computers work, and given your CS degree, I would recommend David Patterson/John Hennessy's Computer Organization and Design, any edition. A lot of universities use this book in a second-year architecture course.

In terms of relating this information to the overall hierarchy of computer systems, I would also recommend Nisan and Schocken's Elements of Computing Systems.


👤 arman_ashrafian
"The Dream Machine" by M. Mitchell Waldrop.

It tells the history of computing by following J.C.R. Licklider. As one of the directors of ARPA, he was responsible for funding research labs working on computer research. He had a major impact on which projects got funded, and in turn on which systems are still being used 60 years later. I honestly love this book so much. If you love computers and history, it's a must-read.


👤 srvmshr
I think the way to go about it is to read some books gradually on the history of computing and the various designs and rationales that evolved over time. Consider these sources:

1. The Annotated Turing.

2. A History of Modern Computing 3ed

3. The ACM Turing award lectures

4. Theory of computation - Dexter Kozen

5. Coders at Work

6. Hackers: Heroes of the computer revolution

Additionally, you could subscribe to Communications of ACM, which is a computing oriented monthly magazine.


👤 firebirdm
I'm coming from a similar background and asked myself that exact question :)

These two books helped me much already:

Programming from the Ground Up by Jonathan Bartlett - A very good introduction to assembly

Learning Computer Architecture with Raspberry Pi by Eben Upton - A great read about the inner workings of memory and the CPU, with references to the past and how things developed


👤 atrn
Lots of good links getting posted. Another interesting resource is the ACM's History of Programming Languages (HOPL) proceedings,

https://dl.acm.org/conference/hopl/proceedings


👤 slyall
The Journal "IEEE Annals of the History of Computing" might be a good source. It has been published for over 40 years.

https://www.computer.org/csdl/magazine/an


👤 stevenbedrick
In addition to the great books listed here, a few more that might be of interest:

- Turing’s Cathedral, by George Dyson

- Black Software, by Charlton McIlwain

- Programmed Inequality, by Mar Hicks

The Dyson book is a rigorous and deep historical dive into the philosophical and practical origins of digital computing, and is really great.

The other two are equally great and deep but cover computing history through different lenses. The Hicks book in particular may be of interest for you, as its emphasis is on the history of computing in the UK. They’re less directly about how computers “work”, as such, and more about how computers and society have interacted with one another in interesting and non-obvious ways, and how those interactions have impacted the ways in which technologies have developed.


👤 kwatsonafter
Ted Nelson's YouTube channel: https://www.youtube.com/user/TheTedNelson

It's also worth looking at: https://www.youtube.com/user/yoshikiohshima. There's a goldmine of talks by people like Alan Kay and Seymour Papert. An important question to ask when "probing" the literature: why are computers the way they are in terms of human-computer interaction and human culture? What is a "computer" without making an appeal to mathematical concepts like Turing machines or the lambda calculus? What are the major "paradigm shifts" that gave us GUIs, mice, etc.?

It's worth noting that the history of popular computers parallels almost exactly the neoliberal economic period. Atari was founded in 1972. Look into the Mansfield Amendment and ARPA. Try to get past the cultural myth that computer companies started in "normal" people's garages. Try to see past the "present concept." Alan Kay has famously said, "The computer revolution hasn't happened yet." It's up to the current and future generations to "really" define what computers are in terms of human culture. Think "living history." Think "the world before and after the invention of the Gutenberg printing press."

https://www.nsf.gov/nsb/documents/2000/nsb00215/nsb50/1970/m...

https://en.wikipedia.org/wiki/Douglas_Engelbart

https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...


👤 machiaweliczny
My practical recommendations:

  * understand brainfuck, or the so-called RAM machine, as the simplest computer
  * read 50 pages of https://en.m.wikipedia.org/wiki/Code:_The_Hidden_Language_of_Computer_Hardware_and_Software
  * read https://www.bottomupcs.com/ to understand low-level stuff
  * learn some C

To understand computation, I think Scheme or the lambda calculus is best. I don't know a good intro.
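To show how small that "simplest computer" really is, here's a hedged sketch of a brainfuck interpreter in Python (names and structure are my own choices): a byte tape, one data pointer, and eight instructions are the whole machine.

```python
def run_bf(program, input_bytes=b""):
    """Minimal brainfuck interpreter: a byte tape, a data pointer,
    and eight single-character instructions are the entire computer."""
    tape, dp, out = [0] * 30000, 0, []
    inp = iter(input_bytes)
    # Pre-match brackets so loop jumps are O(1).
    jumps, stack = {}, []
    for i, c in enumerate(program):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    ip = 0
    while ip < len(program):
        c = program[ip]
        if c == '>':
            dp += 1                       # move pointer right
        elif c == '<':
            dp -= 1                       # move pointer left
        elif c == '+':
            tape[dp] = (tape[dp] + 1) % 256
        elif c == '-':
            tape[dp] = (tape[dp] - 1) % 256
        elif c == '.':
            out.append(tape[dp])          # output current cell
        elif c == ',':
            tape[dp] = next(inp, 0)       # read one byte (0 at EOF)
        elif c == '[' and tape[dp] == 0:
            ip = jumps[ip]                # skip loop if cell is zero
        elif c == ']' and tape[dp] != 0:
            ip = jumps[ip]                # repeat loop if cell is nonzero
        ip += 1
    return bytes(out)

# Set cell 0 to 8, add 8 to cell 1 on each loop pass (8*8=64), then +1 -> 65 ('A').
print(run_bf("++++++++[>++++++++<-]>+."))  # b'A'
```

Tracing even a one-liner like this by hand teaches the same fetch-decode-execute loop a real CPU runs, just with a vastly smaller instruction set.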

Bear in mind that what we have is just one implementation/abstraction of computation, and likely still a suboptimal one. That's why people keep coming up with new languages/VMs. I wonder if some alternative to the RAM machine exists; I've heard about Lisp machines…


👤 ivan_ah
IEEE has a special interest group called Silicon Valley Technology History Committee, which regularly hosts talks/discussions: https://r6.ieee.org/sv-techhistory/?page_id=320

Here is an example link from a recent session on the history of Ethernet networking standard: [ Ethernet’s Emergence from Xerox PARC: 1975-1980 ] https://www.youtube.com/watch?v=SVEcqZnGya0


👤 psahgal
I have a bachelor's in Computer Engineering from University of Illinois at Urbana-Champaign and several of my courses covered how computers work in detail!

- ECE 190 and ECE 290 covered basic programming, logic gates, and the basics of software processor architecture.

- ECE 391 (one of the hardest courses in the school) covered x86 assembly and operating system design. The capstone project for the course was to build a simple OS with terminal input.

- ECE 411 covered processor architecture in detail, and how a modern x86 processor is built.

There should be courses from other universities that cover the same topics. Here are some similar courses I found on MIT's OpenCourseWare platform.

- Computation Structures covers logic gates and other standard electronic constructs. https://ocw.mit.edu/courses/6-004-computation-structures-spr...

- Operating Systems Engineering covers fundamentals of operating system design: https://ocw.mit.edu/courses/6-828-operating-system-engineeri...

Best of luck!


👤 SilasX
Just to piggyback on this: I'd be interested in the pre-computer history of computing. That is, a survey of how all the computation problems were handled before (electronic) computers: storing large amounts of data, having "databases" that need to answer queries over a large geographic area, how they replicated those "databases", how they indexed information, how they did backups, and so on.

👤 krallja
The textbooks I used in university were "From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry" (Campbell-Kelly) and "A History of Modern Computing (second edition)" (Ceruzzi). There is a brand-new update to the second one, "A New History of Modern Computing" (Haigh/Ceruzzi) that I'm looking forward to reading this summer.

👤 sk1pper
In the vein of learning how computers work, osdev has got to be pretty high up there. It’s so much fun - it has become my hobby. I’m surprised no one else seems to have mentioned it.

I just finished implementing a really basic network stack for my x86 kernel, including a (crappy) driver for the RTL8139 network card, and I learned a ton about how the internet works. I learned it all in college, but there's something different about grappling with it directly.

And I’ve gotten pretty good at C in the meantime. I’ve also learned a ton about virtual memory, page tables, system calls, various hardware, how data is stored on disk, the ELF file format, how processes are loaded and executed, the x86 calling convention, a little bit of assembly code, just to name a few.

Check out https://wiki.osdev.org for where to start. I’m hoping to start writing some blog posts about all of this soon, to provide a resource to complement the osdev wiki. A lot of info on this is surprisingly hard to dig up.


👤 m1keil
One resource I stumbled upon is The Dream Machine. It's a very broad historical overview of computing throughout the last 70 years or so.

👤 ModernMech
I did a quick search and didn't see HOPL (History of Programming Languages) mentioned. You can learn a lot about the history of computing in general by going through that conference series. HOPL IV was just last year and had some great talks. https://hopl4.sigplan.org

👤 andyjohnson0
Turing's Cathedral by George Dyson is a good source on the history and development of computation in the 1930s through to the 50s. It's very centred on the work that was done at Princeton by Von Neumann et al [1] and lacks coverage of important work that was going on at the same time in Germany, the UK, and other places.

You might want to look into how the idea of computation came out of mathematical work in the early twentieth century. The Annotated Turing by Charles Petzold is good if you're up for some maths.

Aerospace and spaceflight were some of the first activities that required large-scale software development. You could check-out Starburst and Luminary by Don Eyles and Digital Apollo by David Mindell.

[1] The author's father was Freeman Dyson, who was at the Institute for Advanced Study (at Princeton) with Einstein, Gödel and others.


👤 shadowofneptune
If you like podcasts, there is Advent of Computing. It's not chronological, instead covering a different topic each episode. The most recent episodes are about magnetic core memory, INTERCAL, a hypertext system developed by the US military, and the Analytical Engine. There are over 80 episodes now, so there's a lot to learn.

Website: https://adventofcomputing.com/

RSS: https://adventofcomputing.libsyn.com/rss

If you want an idea of how computers work, there are toy virtual machines that are a good teaching tool (https://peterhigginson.co.uk/RISC/).


👤 gompertz
I find reading old issues of Byte magazine from 1975 up to around 1989 very educational. There appears to be a complete archive here: https://archive.org/details/Byte-Magazine-Complete

👤 peterkos
On the more "history" side of things -- Podcasts!

My very first introduction to anything "old" tech was through the TechStuff podcast[0] (re: 2011-era episodes, so sort by oldest).

More recently, the On The Metal podcast[1] has been a really cool deep dive into old tech history, especially the season 2 episode with John Graham-Cumming.

About implementations, my first real playing around with assembly was "Learn TI-83 Plus Assembly in 28 days"[2].

[0]: https://player.fm/series/techstuff-2152808

[1]: https://oxide.computer/podcasts

[2]: https://tutorials.eeems.ca/ASMin28Days/welcome.html


👤 d136o
I love this question because I’ve also been fascinated with the history of the field, some suggestions below.

From September 2021:

A new history of modern computing

https://mitpress.mit.edu/books/new-history-modern-computing

Skip around its various chapters; it's full of little details.

Also a fun read is this old article about the silicon in Silicon Valley. It's from a long-dead magazine and is titled "They Would Be Gods":

https://www.dropbox.com/s/l9mi2aqnyf5fp3l/They%20Would%20Be%...

Lastly, the part I enjoyed most of Walter Isaacson's bio of Jobs was the adjacent history.


👤 evo_9
Worth finding and watching is the three-part PBS series Triumph of the Nerds, hosted by Robert X. Cringely.

It covers the rise of the PC up until the early 90s and has interviews with everybody, including Bill Gates, Steve Jobs, and Larry Ellison. It's pretty amazing.


👤 photochemsyn
This site has a nice timeline of computer development history dating back to the 1930s:

https://www.computerhistory.org/timeline/computers/


👤 enahs-sf
Computer history museum in San Jose is pretty cool.

👤 jonjacky
Many many suggestions two years ago in this Ask HN: Computer Science/History Books?

https://news.ycombinator.com/item?id=22692281


👤 jll29
1. Visit Bletchley Park and the attached computer history museum.

2. Check out a recent computing history book like: Thomas Haigh and Paul E. Ceruzzi (2021) A New History of Modern Computing, Cambridge, MA, USA: MIT Press.


👤 asciimov
I highly enjoy the nandgame[0]. It's a game that goes from building simple logic gates all the way up through building memory and an ALU. While you can go into it blind, expect to study and look up a ton of stuff if you have never had an intro electronics course. Best of all, it's a free web game.

Classes to look into: an intro course in microcontrollers would be a good place to start; you will usually find them attached to the Electrical Engineering department. Maybe also take a course in Circuits or Computer Architecture.

[0] - www.nandgame.com
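As a taste of the game's central idea — that every gate can be built from NAND alone — here's a quick sketch in Python (my own illustration, not part of the game itself):

```python
# Everything from NAND: the premise behind nandgame and nand2tetris.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """One bit of an ALU's addition, built entirely from NANDs."""
    return xor_(a, b), and_(a, b)  # (sum, carry)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

The game has you do the same thing with wires instead of function calls, then chains half adders into full adders, an ALU, and memory.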



👤 scp3125
It's tangential, but "Where Wizards Stay Up Late: The Origins of the Internet" by Katie Hafner and Matthew Lyon is one of my favorite books on the foundations of the Internet.

👤 gravypod
A great summary, from someone who was influential in web technologies, can be found in this series "Crockford on JavaScript": https://www.youtube.com/watch?v=JxAXlJEmNMg&list=PL766437924...

(context: https://en.wikipedia.org/wiki/Douglas_Crockford)


👤 smackeyacky
Youtube "The computer chronicles"

Fascinating show, mostly about micros but featuring early industry legend Gary Kildall.

The software reviews are hilarious. The predictions of the future of computing are always wrong. The guests demoing stuff are always cut off as soon as it gets interesting.

I started programming as a teenager in that era but never saw the show at the time. For me it's eye-opening just how amateur the industry really was. The show is unintentionally funny now, but it really gives a great idea of the time period.


👤 asteroidimpact
I found this to be a pleasant primer before delving deeper into the subject. Though, as you can see, as with all things, there are different takes on its merit based on where people are coming from.

https://www.goodreads.com/book/show/191355.Darwin_Among_The_...


👤 rg111
Read these books:

- Innovators by Walter Isaacson

- Code by Charles Petzold

- The Annotated Turing by Charles Petzold

- Where Wizards Stay Up Late by Katie Hafner

- The Information by James Gleick


👤 jeffjeel
Though it's geared more for the non-CS, general population, I found Crash Course Computer Science with Carrie Ann Philbin to explain concepts clearly and it's entertaining https://www.youtube.com/playlist?list=PL8dPuuaLjXtNlUrzyH5r6...

👤 fourthark
If you have the money and time to take classes, I'd recommend whichever of the standard CS foundation courses interest you:

- Programming Languages (and then Compilers, my favorite)

- Algorithms

- Operating Systems

At a decent school with some level of difficulty, you'll learn the big picture while doing fun projects for homework, along with history.

Programming is a craft, not a science, but it overlaps with math in a lot of places.


👤 BlasDeLezo
A Computer Called LEO, by Georgina Ferry. This book tells the story of how the Lyons teashop company created LEO, the first business computer. It also tells the story of early computing, from Charles Babbage's Difference Engine to the code-cracking computers at Bletchley Park and the ENIAC in the US, and of the postwar British computer business.

👤 89vision
https://oxide.computer/podcasts

This podcast is everything


👤 mandeepj
> I've become very aware that JS/TS and Swift etc are just APIs on top of APIs.

I'd call them abstractions. That's how it is: high-level programming languages (C#, say) are just abstractions so that we don't have to remember machine instructions, much like a name in Contacts mapping to a phone number; the former is far easier to remember.
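You can peek one layer down yourself. For instance, Python's standard-library `dis` module shows the interpreter bytecode a line of high-level code compiles to — bytecode rather than machine instructions, but it's the same abstraction idea one rung earlier:

```python
import dis

def add(a, b):
    return a + b

# Print the stack-machine instructions hiding under "a + b".
# The listing includes opcodes like LOAD_FAST and BINARY_ADD
# (BINARY_OP on Python 3.11+), which the interpreter executes
# so that we never have to think about them.
dis.dis(add)
```

Each layer down (bytecode, JIT/compiled code, machine instructions, microarchitecture) trades convenience for control in the same way.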


👤 mhh__
Read papers by the people who actually invented things. I've been doing this a little recently, it's very eye opening.

👤 not-bob-
I had aspirations of writing a book on the pre-"software engineering" history of software, but it hasn't made much progress.

I used NATO's conferences in 1968 and 1969 on "The Software Problem" as my inspiration.

Now that the ACM digital library is available without a subscription, it would be a good source for their publications.


👤 markus_zhang
Maybe buy a raspberry pi pico and code it in assembly?

Or try to find a retro computer, e.g. a BBC micro and start programming it for fun?


👤 denvaar
PBS made a "crash course" computer science series that covers a lot of topics https://www.youtube.com/watch?v=tpIctyqH29Q&list=PLH2l6uzC4U...

👤 echoradio
“How Computers Really Work” by Matthew Justice. [1]

I enjoyed this book because every chapter includes hands-on hardware and software experiments for you to see the concept described in action.

[1] https://www.howcomputersreallywork.com/


👤 NuSkooler
Maybe not exactly what you're looking for, but Xibalba BBS (https://xibalba.l33t.codes for a web UI) hosts a ton of articles on computing history in the files section.

👤 ev0lv
My HS teacher made a pretty good high level video on computing history. I recommend starting there.

https://www.youtube.com/watch?v=MZ3tSPF83yo


👤 simonebrunozzi
Obligatory mention: "Secret History of Silicon Valley" by Steve Blank [0].

[0]: https://steveblank.com/secret-history/


👤 DanEEStar
I enjoyed this book recently:

Computer: A History of the Information Machine by Martin Campbell-Kelly, William Aspray

But this is first of all a history book. Nonetheless, I learned a lot of things!


👤 shreyshnaccount
A tangent, but you might find interesting starting points in the nand2tetris course (www.nand2tetris.org), and in reading seminal papers by the likes of Turing and Church.

👤 devmunchies
The Acquired podcast is really good. Both speakers have CS degrees but are VCs now. Learned a ton from the TSMC, NVIDIA, Sony, A16Z, Epic, and Sequoia episodes.

👤 kentlyons
If you're ever in Silicon Valley, take a stroll through the Computer History Museum (which used to be SGI for some meta history fun).

👤 Derek0116
_UNIX A History and a Memoir_ by Brian W Kernighan

👤 gsinclair
I enjoyed a book called “The Binary Revolution” by Neil Barrett. It gave me a good sense of computing history from c1930 to c2000.

👤 poiuiopkj
I would recommend: The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution - Walter Isaacson

👤 t312227
hello,

about 2 years ago there was a similar thread here @ HN

* https://news.ycombinator.com/item?id=22907211

awesome-computer-history

* https://github.com/watson/awesome-computer-history

br v



👤 timzaman
Computer History Museum in Mountain View (CA) is amazing. Can spend half a day there.

👤 NateEag
A few recommendations from an avid armchair computer historian:

- Dealers of Lightning is a wonderful book covering Xerox PARC's history and contributions. If you don't know what Xerox PARC is, then you should definitely read it.

- Where Wizards Stay Up Late is a highly-readable, engaging book covering the history of the Internet's early development.

- Soul of a New Machine gives a compelling glimpse into the era when physical machines and unique architectures were more dominant than software in shaping the market.

- The Jargon File as maintained by Eric Raymond is not without controversy, but I think it's still fair to say a lot of computing folklore and cultural history is preserved there. http://www.catb.org/jargon/html/index.html

- Folklore.org is a wonderful, wistful collection of stories from the early days of Apple Computer, as told by some of the engineers and programmers who made it what it was in the 80s and early 90s.

- The Thrilling Adventures of Lovelace And Babbage is a wonderful graphic novel that's full of utterly ridiculous fiction that's only loosely inspired by the title characters. However, it is jam-packed with footnotes about the actual history from top to bottom, and in my opinion, there probably isn't a better or more fascinating glimpse of the proto-history of the computer anywhere.

- Douglas Engelbart's Mother Of All Demos is well worth watching (can be found on YouTube), and maybe reading some commentary on. Mind-blowing what his team put together that we still haven't really matched in some ways.

- Vannevar Bush's piece "As We May Think" isn't really about computers, but it's hard not to connect it to them when you read it. And then maybe to sigh and wonder how someone who didn't have any machine like what he describes can have a vision more compelling than what we've actually managed to build, so many decades before it happened.

- If you're interested in hypertext, look into Ted Nelson. None of his work ever really took off, and Project Xanadu was a legendary mishandled undertaking, but his vision for what might have been is fascinating, and influenced many of the software pioneers, as I understand it.

- This glorious video of using a 1930s teletype as a command-line Linux terminal taught me a surprising amount about why the classic Unix tools work as they do. https://www.youtube.com/watch?v=2XLZ4Z8LpEE

Enjoy!


👤 watersb
Live a long time and never stop learning.

👤 AlexCoventry
You may find the history of the first large-scale digital computer interesting.

https://www.amazon.com/Colossus-secrets-Bletchley-code-break...

It was used by the British to break the Lorenz cipher (which the Nazis used to encrypt high-level strategic communications.)


👤 davidf18
There are YouTube videos of Seymour Cray.

Learn about the CDC 6600 and the architecture compared with top IBM 360 computers. Learn about Cray I.

Cray was the builder of the fastest computers in the world for a long time. He always sold early machines of each model to scientific government labs and the NSA; the first 6600s were delivered to Livermore and Los Alamos (see Wikipedia's CDC 6600 article).

With fewer than 30 people, Cray built a computer (the 6600) about 2 times faster than anything IBM with its massive budget could build. There is a famous letter about this fact from IBM chief Thomas Watson Jr. on the Computer History Museum website.

When IBM used ICs for the 360s, Cray still used transistors for the CDC 6600. His reasoning is great.


👤 DontMindit
If you wish to learn how COMPUTING works, not computers... then Rosen's book on Discrete Mathematics is the MASTER KEY.