Got me wondering - we've had way too many shows now dealing with modern digital culture (hello, Silicon Valley). There have been some attempts to tell stories from the early days of microcomputers (Halt and Catch Fire, for instance), and that makes sense, given that those are the machines many Gen Xers played with at home or at school.
But Hollywood's version of Silicon Valley - Silvercon Valley? - has yet to celebrate the minicomputer and mainframe eras in a real way. They have done a bit of deep diving into the 50s, but only in the form of standard-issue Genius Porn where the computer operators who broke wartime codes come off like Harry Potter characters, with modern social causes overlaid as a kind of historical corrective. I somewhat doubt Turing is any happier for the fact that the world no longer cares that he was gay but still can't grasp the awesome stuff he actually did. He's not here for it.
Grim thoughts aside, these PDP-8s and the people who worked with them daily deserve a story, complete with looooots of pornographic shots of the machines themselves. Picture a love story that begins with someone bumping their true love's pile of punch cards into a game of 52 pickup and ends with lurid shots of stacks of cards being sucked into the reader.
Anyway, the best story would heroize someone real from the era. I know a lot of stories, but I can't quite find the one that deserves someone having a go at turning it into a script set in that gorgeous era of earth-toned data.
What do you think, HN?
"The Soul of a New Machine is a non-fiction book written by Tracy Kidder and published in 1981. It chronicles the experiences of a computer engineering team racing to design a next-generation computer at a blistering pace under tremendous pressure. The machine was launched in 1980 as the Data General Eclipse MV/8000.[1]
The book, whose author was described by the New York Times as having "elevated it to a high level of narrative art",[2] is "about real people working on a real computer for a real company",[3] and it won the 1982 National Book Award for Nonfiction[4] and a Pulitzer Prize for General Non-Fiction."
There's a story about the IBM System/360 Model 75. The engineers who designed the Model 65 put the registers on roller blinds, so that one row of flashing lights could display one of several different registers. The plan was to do the same for the much-faster Model 75. Senior management is said to have nixed that: if the machine costs 3 times as much as the 65, it must have 3 times as many lights. So the 75 lost its roller blinds.
https://www.amazon.com/UNIX-History-Memoir-Brian-Kernighan/d...
He was working for the Columbia, SC city government, maintaining the code on their mainframe. Due to some bureaucratic mixup, he never got the parking pass he was supposed to get, and he kept getting parking tickets. But as it happened, the tickets were issued by the very mainframe he was programming, so he just changed the code to make all of his parking tickets disappear.
https://www.youtube.com/watch?v=EY6q5dv_B-o
...not to mention Ken's story of Doug McIlroy writing his TMG compiler-compiler on a sheet of paper, then using it to compile itself - by hand - thus "feeding his sheet of paper his sheet of paper."
Runner-up: the story of Texas Instruments and their innovative use of cheap transistors.
I'd like to suggest a larger project: a history of the features we've come to call a "computer". For example, the stack. The PDP-8 didn't have one. Programmers would roll their own in software, usually with the limitation that you could only have one level of call, because you only stored one return address.
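(For the curious, here's a rough toy model in C of why that limitation existed - on the PDP-8, the JMS instruction deposited the return address into the first word of the subroutine itself, so each routine held exactly one pending return. This is an illustrative sketch, not real PDP-8 code; the names mem, pc, jms, and jmp_i are all made up for the illustration.)

    #include <stdio.h>

    /* Toy model of the PDP-8's JMS call: the return address is
     * stored in the subroutine's own first word, and return is an
     * indirect jump through that word. Call the routine again
     * before it returns and the saved address is clobbered -
     * which is why recursion needed a hand-rolled software stack. */

    #define MEMSIZE 4096
    static int mem[MEMSIZE];  /* 12-bit words, modeled as ints */
    static int pc;            /* program counter */

    void jms(int sub) {       /* JMS sub */
        mem[sub] = pc;        /* save return address in mem[sub] */
        pc = sub + 1;         /* execution starts at sub + 1 */
    }

    void jmp_i(int sub) {     /* JMP I sub: return through mem[sub] */
        pc = mem[sub];
    }

    int main(void) {
        int sub = 0200;       /* subroutine entry word (octal) */
        pc = 0101;            /* address of the word after the JMS */
        jms(sub);
        printf("return address saved at %04o: %04o\n", sub, mem[sub]);
        jmp_i(sub);
        printf("pc after return: %04o\n", pc);
        return 0;
    }

A software stack, then, just meant keeping your own array of return addresses plus an index, pushing on entry and popping on exit - entirely the programmer's job.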
So: what machine gave us the first hardware version of the stack? When did using a stack become the job of the designer, not the programmer? As far as I can tell, the first one was the Burroughs B5000. In fact, I'd wager the B5000 architecture pretty much set the standard for all the computers that followed.
But I think that story would be really interesting, as long as it's not only about the tech but about all the people who tried, and the one(s) who succeeded - and not just a single feature, but the whole set of features we now consider required for a "real" computer.
There were a few decades of research into symbolic AI and expert systems - "Good Old-Fashioned Artificial Intelligence" - before the modern deep-learning revolution. Lots of promise, interesting research, and very cool (and esoteric) hardware came out of it, but it never found wide application outside the lab. You could pitch it as a sort of analogue to Who Killed the Electric Car? (2006).
Lee Felsenstein - Homebrew Computer Club, the Osborne 1, the Sol-20 (this + Community Memory is a good story)
Ed Roberts - MITS
Nat Wadsworth - created a PC in 1973, heartbreaking story http://www.willegal.net/feature_stories/Nat%20Wadsworth%20-%...
Dr. Robert Suding - created the Digital Group computers
Robert Noyce - co-founder of Fairchild Semiconductor and Intel. There's a really good book about him called The Man Behind the Microchip.
[1] "How to Think about Parallel Programming: Not!" - Guy L. Steele Jr. (Strange Loop 2010) https://www.youtube.com/watch?v=dPK6t7echuA
The first part of the book describes MIT's Project MAC and AI Lab, from the TX-0 to the PDP-10.
The first third of Hackers by Steven Levy has some interesting stories about the PDP-8 that I think could also be adapted to film.
(At least if you aren't completely set on the PDP-8.)