HACKER Q&A
📣 amichail

Could early 80s computers have had better software given today's CS?


For example, could they have had a better built-in language using the same hardware?


  👤 Animats Accepted Answer ✓
Yes.

Languages

* C should have had standard types comparable to Rust's Vec and String. Pascal did. Null-terminated strings were a terrible idea, and are still a curse. Plus a sane set of string functions, not "strcat" and its posse. They didn't have to be part of a general object system. Should have happened early.

* Slices. Easy to implement in a compiler, and they eliminate most of the need for pointer arithmetic. Again, should have happened early. (A rough sketch of both ideas follows below.)

Those two alone would have eliminated tens of thousands of pointer bugs and decades of security holes.
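
To make that concrete, here is a minimal sketch in C of what a length-carrying string and a slice could have looked like. The type and function names (str_t, slice_t, str_append, slice_sub) are hypothetical, chosen only for illustration; nothing like this was in any shipping compiler.

    #include <stddef.h>
    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical length-carrying string: no NUL terminator required. */
    typedef struct { size_t len, cap; char *data; } str_t;

    /* Hypothetical slice: a bounds-checked view into someone else's bytes. */
    typedef struct { const char *ptr; size_t len; } slice_t;

    /* Append src to dst, growing the buffer as needed. Unlike strcat,
       the destination knows its own length and capacity. */
    static int str_append(str_t *dst, slice_t src) {
        if (dst->len + src.len > dst->cap) {
            size_t ncap = (dst->len + src.len) * 2;
            char *p = realloc(dst->data, ncap);
            if (!p) return -1;
            dst->data = p;
            dst->cap = ncap;
        }
        memcpy(dst->data + dst->len, src.ptr, src.len);
        dst->len += src.len;
        return 0;
    }

    /* Take a sub-slice; out-of-range requests are clamped rather than
       walking off the end of the buffer. */
    static slice_t slice_sub(slice_t s, size_t start, size_t count) {
        slice_t out;
        if (start > s.len) start = s.len;
        if (count > s.len - start) count = s.len - start;
        out.ptr = s.ptr + start;
        out.len = count;
        return out;
    }

Nothing here needs an object system: a struct that knows its own length and capacity is enough to retire strcat-style overflows and most manual pointer arithmetic.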

Operating systems

* UNIX should have had better interprocess communication. It took decades to get that in, and it's still not that good in Linux. QNX had that figured out in the 1980s. There were some distributed UNIX variants that had it. System V had something.
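
For reference, the "something" System V had was message queues. A minimal sketch of that API (msgget/msgsnd/msgrcv, still present on Linux today), just to illustrate discrete-message IPC as opposed to byte-stream pipes; error handling is minimal and the key path is arbitrary:

    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/ipc.h>
    #include <sys/msg.h>

    struct msg { long mtype; char text[64]; };

    int main(void) {
        /* Create (or open) a queue keyed off an existing path. */
        int qid = msgget(ftok("/tmp", 'Q'), IPC_CREAT | 0600);
        if (qid < 0) { perror("msgget"); return 1; }

        /* Sender: one discrete message, not bytes on a stream. */
        struct msg out = { 1, "hello from another process" };
        if (msgsnd(qid, &out, sizeof out.text, 0) < 0) perror("msgsnd");

        /* Receiver: the next whole message of type 1, or nothing. */
        struct msg in;
        if (msgrcv(qid, &in, sizeof in.text, 1, 0) >= 0)
            printf("got: %s\n", in.text);
        return 0;
    }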

Networking

* Alongside TCP, there should have been a reliable message-oriented protocol. Send a message of any length and it gets delivered reliably. Not a stream, so you don't have issues around "is there more coming?". There are RFCs for such protocols, but they never went anywhere.
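
To see why a stream hurts here: every application on top of TCP ends up re-inventing message framing by hand. A rough sketch of that boilerplate follows (the helper names are hypothetical); a reliable message protocol would have let the kernel hand you exactly one message instead.

    #include <stdint.h>
    #include <unistd.h>
    #include <arpa/inet.h>   /* ntohl */

    /* Hypothetical helper: keep reading until exactly n bytes arrive,
       because a stream socket is free to return partial reads. */
    static int read_full(int fd, void *buf, size_t n) {
        char *p = buf;
        while (n > 0) {
            ssize_t r = read(fd, p, n);
            if (r <= 0) return -1;
            p += r;
            n -= (size_t)r;
        }
        return 0;
    }

    /* Receive one application "message": a 4-byte length prefix, then the
       payload. With a reliable message-oriented protocol, this would be
       the kernel's problem instead of every application's. */
    static ssize_t recv_message(int fd, void *buf, size_t cap) {
        uint32_t len_net, len;
        if (read_full(fd, &len_net, sizeof len_net) < 0) return -1;
        len = ntohl(len_net);
        if (len > cap) return -1;   /* sender's message too big for our buffer */
        if (read_full(fd, buf, len) < 0) return -1;
        return (ssize_t)len;
    }

(SCTP, standardized years later, is roughly this kind of protocol, and it indeed never displaced TCP.)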

Databases

* Barely existed. This resulted in many hokey workarounds.

With all of those, we could have had CRUD apps by the late 1980s. That covers a whole range of use cases. There were client/server systems, but each had its own system of client/server interconnection. With all of the above, it would have been much easier to write basic server applications.


👤 JohnFen
No, I don't think so. Today's software engineering optimizes for minimizing dev time, which comes at the expense of using more resources than are necessary to accomplish our tasks. Yesterday's software engineering optimized for minimizing resource use rather than dev time. That mindset makes a tremendous difference in how you approach the engineering.

👤 outworlder
Computer Science hasn't advanced all that much. Some algorithms have been improved but it is a very slow process, as one would expect.

When it comes to 'software engineering', such as it is, we have regressed. We are incredibly wasteful, in the name of productivity. Some of that trade-off makes sense, but a lot of it doesn't, and there are diminishing returns. The microservices trend, which has spread like wildfire, is a good example. It makes sense in some cases, but the additional resource consumption is astronomical, both in computing resources and in the humans it is supposedly meant to save. The culture has shifted to the point that, if you say a particular system is better served by a monolith, you are seen as an alien at best and an unskilled hack at worst.

Sure, your team can package a microservice in a container, using any language, expose an API, ship it to Kubernetes and publish your API spec without having to talk to anyone. But now you have a whole other team managing your K8s and AWS environments, a whole bunch of people managing the different database systems that are isolated per service, yet another group dealing with whatever messaging mechanism you probably added once direct API calls became an issue, and a full team of SecOps people deploying the latest and greatest tools from the CNCF to fix the problems the architecture created. In other times, a lot of that would have been solved with a function call. Worse yet, runtime environments are duplicated, and even the simplest service now demands multiple gigabytes to run. Add HA requirements and you have an enormous overhead, to the point that deploying multiple copies of the monolith would have been far simpler, and much cheaper.

In the 80s, we had Lisp Machines. Nothing has come close since. Maybe if we had stuck with S-expressions, we wouldn't have had to reinvent markup languages every 5 years (XML, JSON, YAML, etc). Even without them, the Lisp condition system is amazing.


👤 edent
Yes.

I wrote about it at https://shkspr.mobi/blog/2020/11/what-would-happen-if-comput...

Look at early games on the Sega Megadrive / Genesis compared to the ones released towards the end of its run.

As we learn more about computer science, we can push systems further than their creators envisaged.


👤 amszmidt
Lisp Machines had high-resolution displays, loads of disk space, a language that was mostly type safe, a built-in editor with capabilities that modern IDEs still often lack, email, etc ...

Here is a pretty brochure for the LM-2 (a relabelled MIT CADR): http://www.bitsavers.org/pdf/symbolics/brochures/LM-2.pdf

But would you have paid 80k USD for one (http://www.bitsavers.org/pdf/symbolics/LM-2/LM-2_Price_List_...) -- and that is without monitor, disk, ...?


👤 thibaut_barrere
I started programming in 1984. I have no doubt that if those computers were suddenly all we had, we would leverage them in surprisingly interesting ways.

A classic example is https://www.vogons.org/viewtopic.php?t=89435, which shows what can be done “today” with old hardware, things that were thought impossible for all the years before.


👤 taeric
There have undoubtedly been advances in algorithms that are hard to really communicate to people. However, I think it is also arguable that the growth in computing resources has enabled some of the algorithmic advances people would cite in these discussions.

That said, the discussion on "better built-in languages" feels like it is aiming at something I don't understand. Older computers booting into a programming environment was something that worked really well for what it was aiming to do. You could argue that BASIC was a terrible language, but there were other options.

I think the current Raspberry Pi setup is quite a good example of what you are speaking to? Mathematica is preloaded with a free license and is near magical compared to the programs most of us are writing.


👤 spacephysics
Guess it depends what you mean by better. Also, in today’s context we have exponentially more complexity that’s abstracted away: OS, drivers, userland software, heck, even browsers alone.

Given this, I do think we have a lot of glue and duct tape fastening our digital world together.

But some of the low-level work done today is probably just as good as what would have been done in the 80’s, if not better, given the now easier access to information and the higher bar of complexity just to participate in low-level development.

I imagine a dev from the 80’s (like my professor from college) may be dismally annoyed at the amount of abstractions some languages have.

Spring boot for Java? So much ‘magic’. He’d say “just write it in PHP” lol


👤 leshokunin
It's a very blurry thing to answer as the other replies highlight.

If you accept data points from the 90s, the work that Kaze Emanuar is doing on Nintendo 64 engineering improvements is a good indication that it's possible: improved compilers, more modern algorithms, rethinking rasterization and how everything happens on device.

There's a lot of "what if we had infinite time to optimize for this stack" for sure. His comparison of various approaches to fast inverse square root on N64 hardware is really impressive.
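
For readers who haven't seen the trick being benchmarked: the classic fast inverse square root, roughly as it appeared in Quake III, approximates 1/sqrt(x) with an integer bit hack plus one Newton-Raphson step. This is the well-known PC formulation (using memcpy rather than the original pointer cast); Kaze's N64-specific variants differ in the details.

    #include <stdint.h>
    #include <string.h>

    float fast_rsqrt(float x) {
        float half = 0.5f * x;
        float y = x;
        uint32_t i;
        memcpy(&i, &y, sizeof i);        /* reinterpret the float's bits */
        i = 0x5f3759df - (i >> 1);       /* magic-constant initial guess */
        memcpy(&y, &i, sizeof y);
        y = y * (1.5f - half * y * y);   /* one Newton-Raphson refinement */
        return y;
    }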


👤 aidenn0
One thing to understand is that, for home computers, the processing power was nearly free, but the storage (RAM and disk) was brutally expensive. An 8- or 16-bit home computer from the early '80s could execute more instructions per second than a PDP-10 from 10-15 years earlier, but had less RAM (most 8-bit systems had no more than 64k, and the original IBM 5150 maxed out at 256k[1]) and was probably limited to a cassette or floppy disk for storage (no hard drive).

So when you see e.g. demos of SymbOS[2] running on an MSX2 (which, being from 1985, barely qualifies as "early 80s") with a 1MB RAM expansion, it's not really the sort of thing that people would have had at home in the early 80s.

1: The PC was offered in configurations as low as 16k; models with less than 64k were fairly quickly discontinued.

2: https://en.wikipedia.org/wiki/SymbOS


👤 mrweasel
The reverse question is what interests me. Modern computers are so much faster, yet we need an ever growing amount of them to solve "the same" problems.

If you could run a small bank or insurance company on a beefy mainframe back in the 80s, then why can't I do the same today with faster hardware? Sure, there's more work for the computer to do, but it's also much, much faster and has almost infinitely more storage space.

How much hardware would you need to run something like HN, if you wrote the code with the same care and understanding of the underlying machine architecture as an 80s software developer? And HN is probably already a good example of how much you can do with very little software.


👤 jmclnx
I doubt it. In the early 80s, 16-bit minis were common, so there were memory limits and segment sizes we do not have to worry about at all these days.

I did not get to program on 32 bit minis until maybe around 1988 or so. By then, minis were just starting to be replaced by PCs.

PCs kind of stopped companies from developing better minis. Where I was, by the late 80s, the high-end mini from the vendor I worked for was a borderline mainframe, maybe even a mainframe. But sales were falling at that point.


👤 jlarcombe
There are numerous projects completed today for "retro" computers, running on unmodified hardware, that would have been almost unimaginable "back in the day", and the vastly more powerful development tools (running on today's hardware, of course) are surely a big factor. The fact that you can debug a game for the BBC Micro, say, by emulating the entire system and seeing at a glance, frame by frame, what everything is doing makes some tricks feasible that would have been incredibly difficult to pull off in the machine's heyday.

👤 surfingdino
They had the best software given the constraints of the architecture. BASIC was the best fit for 8-bit systems with tiny amounts of RAM. The interpreter was easy to implement and port. As soon as we moved into the 16-bit era, more languages became available, because the hardware became more powerful. It was also around that time that computers stopped being shipped with built-in languages and began to be shipped with a bundled programming language, or without one.

👤 phkahler
Yes. They could have had better software given the CS of the 80's too. My first computer was an Interact [1]. The best games for it came out after production stopped. Having fully dissected the ROM I can say filling the 2nd (empty) ROM socket would have allowed much faster implementations of the built-in functions and some additional ones. The machine was so short-lived nobody ever maximized its potential.

On the hardware side, the graphics were very low-res with each pixel being 3 scan lines high. Simply adding RAM (there was an upgrade available to go from 16K to 32K) to the base machine should have allowed higher resolution graphics by not re-using pixel data so much or at all (simple hardware changes). Higher horizontal resolution should have been possible but with more complex hardware changes.

I'm looking forward to the end of scaling. I'm hoping that leads to more efficient software at all levels over time.

[1]: https://en.wikipedia.org/wiki/Interact_Home_Computer


👤 gizajob
I don’t see how Elite on the BBC Micro could have been written any other way. Today’s CS couldn’t have got the job done. Same with Manic Miner on the ZX Spectrum. Both involved extremes of optimisation using assembly to totally max out the limited machines. I respect those two pieces of software as works of art a million times more than anything written in the past 20 years.

👤 hnlmorg
A lot of the decisions made in the 80s were more based around hardware limitations. For example BASIC was chosen not because it was the best programming language for home computers but because it was able to fit on tiny ROMs on systems with between 16 and 64 kilobytes (!!!) of RAM.

The reason we have DOS is because CP/M could run on those systems too. A Unix-like OS wouldn’t even come remotely close to running on early 80s micro computers.

And much has been said about the problems of UNIX's design, but the reason it was designed the way it was is entirely due to limitations in the hardware of its era too.

Unfortunately, once a standard has been set, it's very hard to change. It's why there are still DOS remnants in modern NT-based Windows, why Linux and macOS still have TTYs, and why C-based languages persist even though decades of CS have given us far better principles to build from.


👤 signaru
Maybe it's cheating a bit if we allow compilers to run on more powerful machines, different from where the program will run. It would then be possible to use more modern or "ergonomic" programming languages, or to rely on convenient libraries (assuming they are also efficiently coded). Programming languages in the 80s would not be as fun as what we have today, partly because they were designed for limited hardware and partly because there weren't a lot of other languages to learn from yet. IDEs, tooling and libraries have also come a long way.

Knowledge access is also something that has improved significantly. Programmers back then may not have had all the information that could have improved their work.


👤 eesmith
Here's one clear "yes". Starting in 1979, Infocom developed the Z-machine for their text adventure games - a virtual machine which made it easier to port across different microcomputers.

During the 1980s, Infocom developers wrote games in ZIL, which was compiled to the Z-machine.

In 1993 Graham Nelson released Inform, which compiled to the Z-machine. Quoting https://www.filfre.net/2019/11/new-tricks-for-an-old-z-machi... :

> This event is considered by most of us who have seriously thought about the history of text adventures in the post-Infocom era to mark the first stirrings of the Interactive Fiction Renaissance, and with them the beginning of an interactive-fiction community that remains as artistically vibrant as ever today. ...

> It wasn’t until some years after Inform had kick-started the Interactive Fiction Renaissance that enough ZIL code was recovered to give a reasonable basis for comparison. The answer, we now know, is that Inform resembles ZIL not at all in terms of syntax. Indeed, the two make for a fascinating case study in how different minds, working on the same problem and equipped with pretty much the same set of tools for doing so, can arrive at radically different solutions. ...

> Having been designed when the gospel of object-oriented programming was still in its infancy, ZIL, while remarkable for embracing object-oriented principles to the extent it does, utilizes them in a slightly sketchy way, via pointers to functions which have to be defined elsewhere in the code. ...

> The Inform standard library, by contrast, was full-featured, rigorous, and exacting by the time the language reached maturity — in many ways a more impressive achievement than the actual programming language which undergirded it. ...

> Because it was coded so much more efficiently than Infocom’s ad-hoc efforts, this standard library allowed an Inform game to pack notably more content into a given number of kilobytes.


👤 RIMR
Yes, absolutely. Look at the leaps and bounds the Demoscene has made in sizecoding for older computers. There's an upper limit, but things are already being done on older 70s and 80s computers that people didn't know how to do back then.

👤 drewcoo
Does that mean Apple ][ and Commodore 64 software or does it mean PLATO, everything from Xerox PARC, and Lisp machines?

👤 rjsw
Atari 8 bit machines had Action! [1].

[1] https://en.wikipedia.org/wiki/Action!_(programming_language)


👤 robbiewxyz
I'd love to hear from someone who's used both e.g. MicroPython and e.g. COBOL.

Both are resource efficient but probably have fascinating differences in what they're like to use.


👤 h2odragon
there are more TCP/IP stacks for DOS than there were back when DOS was widely used. so, yes.

Leaving aside the evolution of language designs: we've gone through several "revolutions" since that time, object oriented and so on. Say all you like about the subtle power of LISP; the fact that there's so much Python code out there doing stuff, and that people can bend it to their own needs, says a lot for modern progress.


👤 mepian
Honestly I'm more impressed by the early 80s software from the likes of Symbolics and Xerox than by modern mass-market stuff.

👤 chrismcb
By "early 80s computers" if you mean home computers like the C64, Apple II, trs80, ibmpc and Sinclair, then the answer is no. Not really. Today's software is memory and power hungry. Which 80s computers didn't have very much of. It is possible we could have done a better job of switching paradigms earlier. But I think the 80s software was pretty good for what it was.

👤 jerf
I think we could have had some better safety, especially as we get into the later 1980s. We could have eliminated nil earlier. We could have had strings with variable-encoded length prefixes, rather than either NULL-terminated strings or strings with a hard-coded 1 byte or 2 bytes of length. Using a bit smarter type-based approach so that the code always knows what type things are would have been pretty useful.
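
A rough sketch of what a variable-encoded length prefix could look like, using a LEB128-style scheme (7 bits of length per byte, with the high bit meaning "more follows"): short strings pay a single byte of overhead, and long strings take only as many prefix bytes as they need. The function names are made up for illustration.

    #include <stddef.h>
    #include <stdint.h>

    /* Write len as a variable-width prefix: 7 bits per byte, the high bit
       set means another length byte follows. Returns bytes written. */
    static size_t put_len(uint8_t *out, size_t len) {
        size_t n = 0;
        do {
            uint8_t b = len & 0x7F;
            len >>= 7;
            if (len) b |= 0x80;
            out[n++] = b;
        } while (len);
        return n;
    }

    /* Read a prefix written by put_len. Returns bytes consumed. */
    static size_t get_len(const uint8_t *in, size_t *len_out) {
        size_t len = 0, n = 0;
        unsigned shift = 0;
        uint8_t b;
        do {
            b = in[n++];
            len |= (size_t)(b & 0x7F) << shift;
            shift += 7;
        } while (b & 0x80);
        *len_out = len;
        return n;
    }

A 100-character string carries one byte of overhead and a 20 KB string three; the string itself needs no terminator and its length is known up front.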

You might conceivably be able to run something like Go, without concurrency (thus eliminating the runtime), starting around the early 1990s on high-end computers, and that would have been a really, really nice system compared to most other things at the time. I pick Go because of its relative simplicity. A variant of D or something would probably also work well.

A practical more-functional language could probably also have been provided, but some careful work would be necessary. Sum types have some details that we can easily just throw power at nowadays but would take a bit more thought back then to keep performance up enough; you may have to trade down on a few details. Linked lists were not as relatively catastrophic back then as they are on modern systems, but they still involved an extra pointer on everything if nothing else; a language more strongly focused on arrays might be able to help.

But it's important not to underestimate the resource poverty of the time. Rust is out of the question. People complain about how slow it is today as it is. The executables modern Rust produces may have run well (with appropriate bittedness changes and such) but the compiler is hopelessly complicated for the 1980s. Nothing even remotely resembling it would have worked. Most modern languages have moved on and are too complicated to fit on machines of that era in any recognizable state.

Modern dynamic scripting languages only began coming online in the late 1990s. Such examples that existed in that space before then were incredibly slow; people pine for the capabilities of the Lisp machines but nobody is sitting here in 2024 pining for their performance characteristics.

I've kind of rewritten your question to be about the 1990s. I think that's because honestly, in the 1980s, no, there probably isn't a huge advantage we could give them. Pecking around the edges, sure. Avoiding some things that only hindsight could show were traps, sure. Commodore could have used some assistance with building a platform that could expand into the future better; for all the advantages that the Amiga had over the IBM world, IBMs ultimately expanded into the computers we run today more smoothly than the Amiga could have. There are some tweaks to graphics hardware we could suggest, especially in light of how the hardware was actually used versus how the designers thought it would be used. But it's not like we'd be going back there with amazing genius techniques for running on their systems that they couldn't have conceived of.


👤 Finnucane
We had WordStar. What more did you really need?
