So I'm wondering: what good ideas that have since disappeared can other HNers remember?
I'll start:
In the golden age of Sun workstations, the BIOS was written in Forth, and the ROM contained a Forth interpreter. Not only was every extension card's ROM interpreted (and therefore every extension card architecture independent), but you were also given a Forth REPL to tinker with the boot process, or in fact at any later point once the system had started, via a special key combination.
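The portability trick is that the card ships its driver as Forth (Open Firmware used a tokenized form called FCode), and any host with the interpreter can run it regardless of CPU. A toy sketch of the idea in Python, using a made-up mini-dialect rather than real FCode:

```python
# Toy sketch of why a Forth-interpreting ROM made drivers portable:
# the card ships Forth-like *source*, and any machine that carries
# the interpreter can run it, regardless of CPU architecture.
# (Hypothetical mini-dialect; real Open Firmware used tokenized FCode.)

def forth_eval(src, stack=None, words=None):
    stack = [] if stack is None else stack
    words = {} if words is None else words
    tokens = src.split()
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == ":":                      # start a word definition
            end = tokens.index(";", i)
            words[tokens[i + 1]] = tokens[i + 2:end]
            i = end
        elif tok in words:                  # run a user-defined word
            forth_eval(" ".join(words[tok]), stack, words)
        elif tok == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif tok == "*":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif tok == "dup":
            stack.append(stack[-1])
        else:                               # number literal
            stack.append(int(tok))
        i += 1
    return stack

# The same "driver" text runs on any machine with the interpreter:
print(forth_eval(": square dup * ; 7 square 3 +"))  # [52]
```

The same stack-and-dictionary model is what made the boot REPL possible: the interpreter was already resident, so a key combination could simply hand you its prompt.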
That was, in my opinion, way ahead of modern BIOSes, even taking OpenBIOS into account.
Your turn?
Being a mischievous teenage hacker didn't land you in prison.
Software makers gave a shit about how much RAM and how many CPU cycles they were using. Disk space was sacred.
The technocrats weren't always a given. At least hacker-hippies that resented corporate control gave us an alternative; we could be living in a completely proprietary world. Compilers and even languages used to cost money, I shudder now when I see a proprietary language.
Once upon a time, computers did what you told them to without reporting you to the Stasi, Google, or a number of marketing firms. Now that kind of freedom is obscure and hard to access for most people.
Software used to not hide all the options to "protect me from myself".
Computers used to be bigger. I love my office computer, but you gotta admit, fridge size and even room size 1401 style computers are pretty damn cool. I'm planning to buy a fiberglass cooling tower for a big-ish computer for a project this summer...
There used to be killer apps and amazing innovations, but now it's just ads and single-function SaaS leases. At least open source projects are still incredible. There are a few amazing commercial software products, though. It's the future, after all.
That's enough grumpy ranting for the minute; I'm sure I'll have to append to this.
(P.S. remember when computers didn't have an out-of-band management system doing God knows what in the background?)
* Getting information and help was difficult. At best there was c.l.c and such if you had a modem, but there was so much gatekeeping going on that it was hard to get a straight answer on anything.
* Source code was hard to come by. Everyone was so damn bent on keeping their precious code secret that you could only learn best practices if you happened to be in a job with good leadership.
* Hobbyist embedded systems were all but impossible unless you rolled your own. Working on an embedded platform meant using crappy tools, expensive and clunky in-circuit-emulators, and proprietary toolchains. Otherwise it was time for a homebrew etching tank.
* Storage, backups, and code versioning were a problem. Sure, we had CVS and eventually SVN and sourcesafe, but man did they suck!
* Hardware was super expensive. Software was super expensive. Getting anything done on a tight budget required a lot of creative thinking.
* Spending all day squinting at a small monitor sucked.
* Software (especially development environments) was chock full of sharp edges and required arcane knowledge and incantations. They were called UNIX wizards for a reason.
* Data communications and interchange formats were TERRIBLE (and always proprietary)
* Multilingual support was an exercise in madness.
* Bash was one of the nastiest undead languages ever invented. Oh wait, it still is...
* "If it was hard to write, it should be hard to read and understand" was the mantra of the day.
Today, even Electron is reserved for the experienced web dev. See roadmap.sh
I believe we've lost a lot when we transitioned from desktop apps to web and mobile apps and JavaScript won.
Even ActionScript was way ahead of JS during its time!
Programming nowadays feels more like an exercise in importing other people's code correctly. I feel like it's mostly writing something that takes data format X, converts it to format Y for library Z and then 'just works'[sic]. The 'heavy lifting' is usually not happening in your code, but in one of the libraries/full-on programs being pulled in, and your code just happens to be calling it with the right parameters.
This isn't all bad, in that it allows ideas to be tested and products to be created in a testable (and sometimes even shippable) form in a ridiculously fast amount of time, but then you're also far more prone to discover some library in your stack had a breaking change a few versions ago and you suddenly don't know when your upgrade will be delivered because you don't know if it's just a 'legacy=true' parameter that needs to be passed in or they redid their core somehow.
Or you try to profile your code just to find 97% of the execution is from the single 'load_thing_from_internet()' call and you have no idea if you want to fork and maintain a branch of that thing, switch it out for something else, or try to write your own. And you probably have dozens of these in the code you don't even know about because the libraries you import are just the same thing.
I think this whole process makes for sloppy, difficult-to-understand, and slightly scary applications -- and this is basically all applications running today.
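The profiling scenario above is easy to reproduce with nothing but the standard library. A hedged sketch, where `load_thing_from_internet` is the commenter's hypothetical stand-in for a slow third-party call (here it just sleeps):

```python
# Sketch: find the one call that dominates a profile. The "glue"
# code we wrote is cheap; the imported helper eats the runtime.
import cProfile
import io
import pstats
import time

def load_thing_from_internet():
    time.sleep(0.2)                 # stand-in for a slow library call

def convert_x_to_y(data):
    return [d * 2 for d in data]    # the glue code we actually wrote

def main():
    load_thing_from_internet()
    return convert_x_to_y(range(1000))

profiler = cProfile.Profile()
profiler.enable()
main()
profiler.disable()

out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())   # the sleep dominates cumulative time, not our glue
```

The cumulative-time column makes the "97% in one call" situation visible immediately; the hard part, as the comment says, is deciding what to do about it.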
Making a website was easy enough that anyone could do it. There were services to build a page (Geocities, Angelfire), and if you wanted a bit more control you could host something on a shared server as simply as FTP'ing some HTML files to a remote directory. Expectations were low. People rarely criticised. That meant some truly wacky and creative things got built. Taking payments online was relatively hard work. That meant no one really expected to make much money online. Even ads were only just starting, and most people didn't bother. People made fan sites for things they were passionate about, just to say they had a website. It was never a "side hustle"; it was just a hobby. That was nice.
It was also the era of Flash, which led to some brilliant and creative sites.
Languages like Perl and PHP were taking hold of server-side generation, so real SaaS businesses were starting to take shape as well.
I miss it a little. I have no doubts that the web of today is better, especially given the fact it's the basis of my 25 year (so far) career, but there are definitely aspects of it that I'd bring back if I could. The web should be more fun.
FoxPro
Imagine everything those "low-code" apps claim to do, but done better, faster, and far more powerfully. And in DOS 2.6.
You could do it all: forms, reports, apps, utilities, and database code (queries, stored procedures, etc.), all in the SAME language with zero impedance mismatch.
And the Form/Report Builder rivaled what Delphi did later.
---
My dream is to resurrect the spirit of it (https://tablam.org). One thing that sets these apart from today's "low-code" tools: they were made to run on YOUR devices, not in a "cloud" where you are at the mercy not only of overlords but of latency. This kind of tool feels way faster than modern ones for one little reason: running locally is way faster!
So, if I could find a way to dedicate myself to it, this could be the major advantage: local first, cloud capable.
- Buying software, especially computer games for Windows 95/98, was a complete gamble. Maybe one third just worked, one third required DirectX/sound card driver/graphics/BIOS fiddling to get something running (though crashes were frequent), and one third just outright never worked at all.
- Web development was an endless nightmare of IE6 compatibility, tables with eight separate PNGs around an element to create a drop shadow, clearfixes and floatfixes, polyfills and fallbacks.
- SVN/TortoiseSVN version control was constantly corrupted or in some dodgy state. Different line endings or upper/lowercase filenames could get files stuck and unrecoverable.
- So much (and I mean SOOO MUCH) money was spent on buying and maintaining the server room (no cloud, remember), so you had to buy and amateurishly maintain all this super expensive hardware, usually on a standard, off-the-shelf T1 line connection to the internet.
- CD-ROMs got scratched, files got corrupted if your computer crashed during saving
- You spent eight hours a day in front of a giant CRT monitor that blasted your eyes with light and radiation, making you look properly stoned when you came home.
... I could go on and on... but honestly, stuff got so much better over time, especially in tech. Sure, there are downsides (lootboxes/microtransactions in games, expensive software subscriptions for what are essentially static products (I'm looking at you here, my $59/month Adobe CC subscription), etc.), but overall, tech is much, much better today than it was 20 or 30 years ago.
Now - when it comes to social interactions and interpersonal communication, I am less sure...
The availability of documentation enabled the porting of Linux and BSD systems in the 1990s without wasting a lot of time on reverse engineering the hardware details from the original OS.
A lot of other people have mentioned that software wasn't scraping the bottom of the revenue barrel by spying on your every move but I'll say it again because I think it's so important. This has been a huge shift in how software is designed and in the incentive structures behind app development. It has pushed the user far, far down the priority ladder, and I look forward to when this period in software history is over.
Oh, and ironically, software was faster.
Today's consumer products are made with the expectation that they should always "just work". The result is that users' reactions to problems tend to range from frustration to panic and rage, rather than an inquisitive curiosity aimed at solving the problem.
I guess you could still tinker with most products, even if some brands make it increasingly hard. The real difference is in the expectations of the users.
Xerox PARC Alto workstation: GUI, Ethernet networking, and the Smalltalk-72 OO programming system, all in 1973! This wheel has been reinvented in part many times since by Apple, MS, and others. How much real progress has there been in the last ~50 years?
I miss repairable machines. Replacing a keyboard in a modern laptop is all hidden screws, sticky tape, and one-time-use plastic studs you have to reglue because they have to be snapped off to remove the broken keyboard. Phooey to you, Acer.
Like others I miss the whole rapid application development movement. VB6 and Delphi just rocked at CRUD apps. Now we got nothing but dependency hell trying to get Electron started after you upgrade some library to pick up a bug fix and it cascades into a complete rebuild. Or trying to remember what dark wizard CLI you use to add a page to angular or react.
I miss being able to catch an exception in VisualWorks, fix the code and hitting continue.
Seriously though, there have been a lot of Golden Ages already and I hope there will be a lot more.
My nostalgia for the time I started out is strong, but when I look at it objectively I see two things I think genuinely were better in the 80's, from a cultural if not an economic perspective:
1) There were many competing hardware and software ecosystems, even paradigms, and it was absolutely not obvious what we'd all be using five or ten years down the road. We live with an impoverished imagination of personal computing now.
2) People working in tech out of pure nerdy love outnumbered the people in it for the money by about 20:1. Now that's reversed, or worse.
As such, the barriers to start exploring programming on your own were low: the development environment was already there, you were familiar with the interface, and it booted in an instant. I haven't seen any modern day technology replicate that ease of access for beginning coders.
I guess my point is that it was more easily understandable and hackable, and it wasn't a 10 GB install if all you need is SOME version of Windows to run your games on.
Also zero phone home or mandatory/dark patterned Microsoft accounts.
Single function operating systems.
Cisco's PIX OS ran firewalls, and it did nothing else.
Netware 2 & 3 shared files and printers, and nothing else.
As a result, Netware (for instance) was relatively small and simple enough to understand completely, top to bottom, and the performance was blistering.
When you use the same handful of general purpose OSes for everything, on all hardware, the inevitable result is that they become huge to try to cover every conceivable base. Result: vast OSes which are vastly complex, so much so that no individual can totally understand the whole thing.
A functional OS for one specific task should fit into no more than a double-digit number of megabytes of code, and one human should be able to read that entire codebase top to bottom in a comparable number of weeks and understand it completely.
If it can't fit into a normal human's head, then it can't be completely debugged and optimised except by stochastic methods. That's bad.
It's normal now and everyone expects it, but it's still bad.
Edit: also no YAML!
I remember working on an e-commerce system in the late 90s. I knew how Apache worked, I knew which modules I had compiled in, I knew how our proprietary code worked, I knew how Linux worked. Maybe to a lesser extent I knew how our Oracle DB worked in a deep sense, but I knew enough about RDBMSs, our PL/SQL, query plans, indexes, etc. that it was sufficient.
Whenever we had an issue, there wasn't a part of the system I wasn't deeply familiar with, and I could just dive in wherever.
I don't even know if that's possible today.
You'd break things, learn, fix them for the time being, until the next thing broke.
Computing today is far less stressful. It somehow reminds me of how vehicles today take next-to-zero mechanical knowledge to operate, whereas not all that long ago you needed to know a mechanic and be reasonably handy yourself to keep them running smoothly.
Now things just work. That's broadly better. But I have to wonder how many of us would be in tech if we hadn't tinkered with tech growing up.
Supply-chain attacks are about to destroy our current model of trust in FOSS. Much as the internet was never designed with security in mind, and so has had to have security layered on top of it, badly, our current model of trusting code from the internet is about to get badly broken.
In 10 years' time we'll look back and get all nostalgic about being able to just pull a package off Git__b without worry.
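The basic mitigation is to stop trusting "whatever the registry serves today" and pin an exact content hash instead, refusing anything that doesn't match. A minimal sketch of the idea (package managers do this for real, e.g. pip's `--require-hashes` mode):

```python
# Sketch of hash-pinning: record a SHA-256 of the artifact at review
# time, then verify every later download against it before use.
import hashlib

def verify_artifact(data: bytes, expected_sha256: str) -> bool:
    """Return True only if the bytes match the pinned digest."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

payload = b"print('hello from some package')\n"
pinned = hashlib.sha256(payload).hexdigest()   # recorded at review time

print(verify_artifact(payload, pinned))          # untouched copy: True
print(verify_artifact(payload + b"#", pinned))   # tampered copy: False
```

Pinning only proves the bytes are the ones you reviewed; it does nothing about malicious code that was already in the version you pinned, which is why it's a mitigation rather than a fix for the trust model.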
Firstly, you don't really mention when the golden age of tech is for you. The answer can be quite different depending on the technological niche you are thinking of[1]. I'd argue that it's not over yet, and we are probably only at the beginning.
Although I agree that many lessons were learned then forgotten to the past, and we keep re-discovering those.
Only this time open source and free software is a thing, hopefully this will help us build on a common and expanding base, instead of reinventing everything all the time.
[1] especially as tech is so vague. The golden age of siege engines was probably the time of the Roman Empire. We keep building on previous technological bases though, so technology is moving forward, you'd also have to define "golden age".
And your user interface was all in PostScript!
Some tech stacks from the past have returned, e.g. Yahoo Groups was reborn as https://groups.io.
Everything is so ridiculously complex now, and weighed down by so much compliance and need for security (because bad actors are everywhere). You need a whole team to even get started on anything worthwhile.
I dunno, I just liked the early days when we were naive enough to assume malicious actors didn’t exist, and everything somehow worked out alright.
Training was solid and documentation was professional. You had to be on the right side of management to spend the money to get either, though.
Some things were better. Not enough to make it worth changing back, though.
Now everything is fake and marketing, empty words and profiling, pure and constant consumerism, and "research" is just a constant justification for the staggering costs of marketing departments that have to sell "new things" to someone.
Once we were the freaks; now everyone is a master of everything while no one knows how anything really works anymore, because of layers on layers on layers of crap.
...I'm very tired. I'd just like to know nothing more about anything and do something else with the little time I have left (time that I will spend anyway infiniscrolling empty things).
This kind of empty nostalgia is the epitome of intellectual laziness.