1) We just expected single core performance to double forever every 2 years or so. Many of us were ignorant of single-core performance scaling and the memory wall issues. I want to be clear: Computer Architects were warning people about this in the early 90s but many ignored their inconvenient truth.
2) 2D GUIs would evolve into 3D GUIs - maybe VR? Maybe something else? And the CLI would be gone
3) Low/no code drag-and-drop style programming tools would take away many dev jobs
4) Microsoft taking over and Unix dying
5) All programming would be object-oriented (We are talking 90s style definition here)
IMHO that didn't happen, the integration morass remained, and I'd consider much of that model a failure in retrospect.
On the infrastructure front, I seem to remember a lot of talk about Frame Relay (EDIT: actually I was thinking of ATM, thanks for correction below). And fiber installs all over the place, lots of money in getting fiber into downtown areas, etc. Also I don't think people really predicted the possibility of the dominance of the "cloud" as it is now, really. I mean, hosting services were I'm sure a growth market but "serious" companies were seen as wanting to have their own DCs, and I don't recall major talk or trend away from that until the mid 2000s ("SaaS" etc. became the buzzword around then, too).
Also JavaScript as a serious presentation layer system wasn't much talked about until the 2000s.
We even had a slogan for it: "computers are bicycles for the mind": https://www.brainpickings.org/2011/12/21/steve-jobs-bicycle-...
I was so naive...
Fast-forward to today: most of the people I know who were in electronics have switched to software.
No one knew what the internet really was and what it would become. Not Bill Gates, not anyone else.
Developers believed that processing power, RAM, storage etc. would continue to grow exponentially to the point where there would just be too much of it and we wouldn't need to care about any resource constraints when writing code.
Writing code line by line in a text editor was supposedly on its way out, to be replaced by fancy IDEs, WYSIWYG, UML etc.
All the jobs were supposed to go to India. Programming as a profession has always been on the brink of death for one reason or another, and yet here we are.
1. Compiled (native) vs. interpreted (bytecode/VMs). It's this forever cycle of "performance required" that shifts to "portability required", which lacks performance, and gradually shifts back, and repeat.
2. Local processing vs. dumb terminals and shared computing. The current "dumb terminal" phase being the idea that you can buy a $100 Chromebook and work entirely online.
I thought the future was making that publishing easier. I wasn't wrong about that: Twitter and Facebook are easy. I just missed the consequences of "easy" becoming "walled garden."
We thought everything could be abstracted away into a perfect library and no duplication of functionality should ever need to occur. In the end, we could even drag and drop (or write XML) to connect all those components together.
We thought multitasking was possible and even a good thing to design for. Hence the very noisy and distracting desktop OS designs we still have today.
We thought in terms of a much smaller handful of programming languages. And anything “serious” would end up in C (or C++, maaaybe Java)
We thought Apple was kind of a weird, surprising company that barely hung on, its computers only shown in movies.
We thought democracy “won” so the future could only be one full of enlightenment, right?
We thought CLIs were so passé and every dev tool needs a GUI.
Merges were done with a diff / merge tool and often manually.
Software libraries were bought for big $ or provided by your OS/compiler vendor, not downloaded. Maybe the “reuse” vision of the past actually WAS realized. Instead of an XML file, it’s my package.json :).
I expected 250 gigabytes to cost less than $4000 by now. ;-)
https://web.archive.org/web/19990220180309/http://www.tsrcom...
By isolation I don't mean lack of socialization of course. There was plenty of socializing back then. But the people I met on Usenet, BBS, IRC and phpBB forums were people like me. We worked on tech projects, we talked tech, we had our specific jargon and subculture.
I distinctly remember when people asked me what I wanted to do back then, I'd make up some lie, but knew that my future would be living in a single room with no social interaction but with my online friends. (I did not think this to be bad at all, this was my idea of a good life).
I thought tech would evolve to make the internet more into an alternate reality, more separate from the real world. Maybe I was reading too much cyberpunk sci-fi.
In any case, I certainly did not anticipate social media, dating apps and the digitization of traditional brick and mortar businesses. And to be honest, I'm not sure we wouldn't be better off without them.
It seems there aren't that many "pure internet" projects out there. IRC replacements are all tailored for the workplace and real-life interaction (maybe except for Discord?). It still feels very strange to me that communication platforms would ask you for your real name (especially when, at this point, there's more "reputation" or "social credit" or whatever you'd like to call it attached to my online persona than to my real-world one).
Crypto is perhaps the last bastion of such an autarkic technology (i.e. by netizens for netizens) but even that is slowly moving toward real world asset tokenization and institutional integration.
I also thought IDEs would be in 3D, with better visualizations of code running in parallel.
And, I didn't expect web applications to be the norm for most people. I thought we'd have a better "standard" cross-platform application development approach.
One thing that doesn't surprise me are application stores. After seeing how 90s applications used to take over your computer and wedge themselves everywhere, I expected that OS vendors would lock things down a bit more tightly.
While we're reminiscing, the thing I remember the most from the late 90's is that it seemed no one knew what they were doing (or perhaps it was just me). The Internet opened up such a frontier in computing, for effectively hobbyists, as I guess we'll never see again. Perhaps that's why I don't recall thinking about what computing would look like in the 21st century--there was too much to be done right in front of my face!
EDIT: There's a mention or two of thin clients. I think only Sun thought that would work :).
Oh ok, I can think of one that actually worked out: Linux on commodity hardware in data centers. It became clear by 98 or so that the combination of Apache (httpd) on a bunch of cheap Linux boxen behind an expensive LB would tear down the Sun/IBM scale up model fairly easily due to both cost reduction and an improved resiliency. Of course there were MySQL bottlenecks to work through, but then memcache landed and pushed that off a cycle. Then there were the 64-bit AMD chips and a large bump in RAM density that pushed it back another. And then I guess true distributed computing landed.
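A minimal sketch of the cache-aside trick that bought MySQL that extra cycle (a plain Python dict stands in here for a real memcached client, and fetch_user_from_db is a made-up placeholder for the actual query):

    import time

    # Stand-in for a memcached client; in the setup described above this
    # would be a shared memcached tier sitting in front of MySQL.
    _cache = {}          # key -> (expires_at, value)
    CACHE_TTL = 60       # seconds

    def fetch_user_from_db(user_id):
        """Placeholder for the expensive MySQL query the cache is shielding."""
        return {"id": user_id, "name": "user-%d" % user_id}

    def get_user(user_id):
        """Cache-aside read: try the cache first, fall back to the database."""
        key = "user:%d" % user_id
        entry = _cache.get(key)
        if entry and entry[0] > time.time():
            return entry[1]                       # cache hit
        value = fetch_user_from_db(user_id)       # cache miss: hit MySQL
        _cache[key] = (time.time() + CACHE_TTL, value)
        return value

Every cheap web box consults the cache before touching the database, so most reads never reach MySQL at all; that is roughly how memcache pushed the bottleneck off a cycle.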
And I thought that conspiracy theories were harmless fun.
* Thin clients: everybody was going to have a 'dumb terminal', and all the computing/storage would happen elsewhere. There were many variations on this idea.
* PIMs: Palm Pilots and all the variations. I knew a guy who quit his job to develop Palm Pilot wares, claiming it was 'the future'. In a way, he was right.
* Windows: Linux was barely there, Unix was for neckbeards and universities, and it looked like Mac was mostly dead. It looked like Microsoft was about to swallow the world.
* The end of mice and keyboards: Yes, it sounds silly today, but supposedly VR/voice/whatever was going to replace it all.
Components would be “snapped” together by non-programmers to make an application.
I suppose that pieces of this became true with the proliferation and adoption of open source. Rather than a binary protocol, HTTP came out on top.
SharePoint brought us WebParts which tried to put the power in the hands of business users, but it turned out they were still too technical and not flexible enough.
I don’t see the role of software developer/engineer going away anytime soon.
Then the momentum shriveled up with Perl 6 and it has just become completely irrelevant. Pretty insane trajectory. It's almost as bad as like ColdFusion lol
And then we got things that were "good enough" at everything that anybody wanted a PC to do, and we went past that to silly stuff that very few people wanted a computer to do, like CueCat.
And everywhere you looked, there were new things happening. The internet. The web. MUDs. People were just trying things, and just throwing the results out there for the world to look at.
I just kind of expected more and better to keep coming. I didn't see PS/2 and IBM trying to close the ecosystem. I didn't see Windows taking over the world and then Microsoft trying to close the (web) ecosystem. I didn't see the Facebook/Twitter/Youtube silos taking over. I didn't see cancel culture crowding out diversity. I didn't see hostility, nastiness, and hatred taking over from the openness and acceptance that used to be there.
Computers failed to create a space dominated by "the better angels of our nature". We not only expected it to, we tasted it. We expected it because it was what we experienced. And then it got overrun by all the worse parts of human nature that we wanted to get away from. Turns out that making people act better takes more than technology.
I expected most people to be able to create and publish their own websites using desktop apps like Microsoft FrontPage or Dreamweaver, and that these types of apps would be the dominant way to create content on the web.
The pace of technology change has accelerated dramatically, the complexity of software has grown beyond imagination, and perhaps correspondingly, the fragility and interdependency has grown out of control.
We (myself included) have become so acclimated to complex software as users that we forget how much work is buried under the surface of a "simple" app. Even accurately describing the behaviors of a system is difficult now (and often not even documented, partly because it changes so often).
When I was taking flying lessons in the 90s, I couldn't believe how antiquated and manual-intensive the aircraft systems were. There seemed to be so many places where pilots needed to follow a checklist to perform manual tasks which could have been automated based on various available data. When I asked my instructor why it was so archaic, he explained that it's a long and arduous process to get new things certified by the FAA.
This difficulty in getting advanced systems approved was mostly due to the processes and tests required to ensure a very high standard of reliability and safety, since lives were at stake. At the time, I thought it was ridiculous. But seeing some of the Airbus automation problems (which cost a few aircraft and some lives), and then seeing the Boeing 737 Max disasters, I see how slower advancement, more testing, and slower release cycles can be beneficial.
But in the software world, the more modern approach is "move fast and break things". Not only is software now never complete (in contrast to when software was released once or once per n-years, on disk or CD), now it is released every 5 minutes with old bugs replaced with new bugs. There are days where every time you open an app, it needs to update first. I'm not sure this is a net improvement.
If I could say one really positive thing, it would be that tooling has gotten really nice. Even I recall when syntax highlighting arrived... and at first was laughed at. On the other hand, the better tooling is virtually required to keep up with the ballooning complexity and volume of the code we write.
I could rant and complain many more paragraphs, but it can be summarized thusly: modern software (and its development) has made some great improvements, but those improvements have been largely offset by reductions in the quality and formality of practices.
I got it right, but with hindsight it was pretty obvious it was going to be a big thing.
I actually own some of the more speculative non-fiction books on "Cyberspace," in particular the representational aspect. I did expect some higher levels of abstraction to be present when it comes to working with data, browsing file systems, making connections, and so on. I was not necessarily expecting wireframe graphics, but yes, more abstraction. The closest I have seen is something like WinDirStat when it comes to looking at a directory, and even that is just colored blocks. I knew the graphics processing would be intense to match the bandwidth of the human visual channel.
I did see the Wild Wild West era of the Internet coming to a close, having donned white and grey hats myself off and on. The early Internet had security holes such that you could drive a bus through them honking and dragging a bunch of rusty bikes behind you. The invention of the img tag more or less began the Advertising Era, whether they knew it or not.
I did have an early suspicion about privacy, and permanence, so my Usenet posts from 1989 never had my name on them. So far this has remained a decent decision.
I did not expect a trend in programming languages I have a hard time describing but can only sum as "yes, every programming paradigm and idiom will be crammed into that language over time if allowed."
I have never expected VR to take off until other technologies are in place and cheap. They aren't here yet.
I have never expected the "Fifth Generation" anyone-can-code drag-and-drop thing to happen and I believe that by the time such a thing is available (and effective at scale), it would be AI anyway.
None of the approaches to AI have been promising to me, aside from Cyc, which I would describe as being necessary, but not sufficient.
I thought tech would be used more for good. The 90s were an incredible era where the internet and the people using it were usually good people interested in changing the world. Now everything is a scam, or collecting information on you. In the end, all internet money flows to advertising which is just sad. It's so over-commercialized and my original hopes have long been dashed.
At least I can run Linux. That makes me happy.
now people watch more static video content than ever and are damaged to a point where it will take a generation or two to recover, if at all -- whoops
Programming environments based on something other than editing text files directly. (See the Self, Smalltalk, or Dylan environments.)
Object-oriented programming world domination. (That came true but we’re slowly recovering.)
I thought things would develop along lines that were of interest to me or which appealed to my personal aesthetic of computing (textual). And that viewpoint turned out to be the most myopic possible in terms of predicting what would actually happen.
If I predict something then almost certainly it will not occur.
Personal identification via the internet would be more streamlined - basically government-backed digital ID cards.
Windows would rule - more or less.
Usenet and IRC would be things.
The big issue was not understanding the importance of ads. Or the impact. We imagined the web would be oriented towards machines talking to machines.
But machines don’t consume ads. So APIs are second class citizens.
We also expected a lot more offshoring.
I was quite young at the time, writing file sharing and chat network applications, and did a lot of experiments building WANs for Doom. Most things I was interested in required single multicast and rebroadcasted packets with low ping time over a fully-decentralized network. Also, tiny bits of mobile code that would load dynamically as you used a network application. Binary serialization versus plaintext. Progressive data loading, especially with fractal image formats. This wasn't really about the future though; it was how things worked. Looking back, technology was mostly not about how to do things better, it was about raw horsepower to do things quickly and poorly. A lot of things changed with ISPs too, because they started throttling uploads and now mine doesn't even allow open ports. P2P ping time among neighbors is much longer than cloud ping, which is absurd. A lot of this has to do with security, which is a bigger deal than I thought it would be.
So, I was wrong about a lot of stuff.
It was a page of text where some words were underlined and blue. And if you clicked on a blue word, it would jump you to a new page before you were even finished reading the first page!
I remember thinking how dumb that was and must be for people with short attention spans. Uninstall and move on. This WWW thing is going nowhere…
I started with symbolic AI: parsers, tree searches, expert systems, logic. Even then it felt like this approach was stuck in a rut. So I changed lanes and studied early ML: decision trees, genetic algo's, nearest neighbours. When I left college SVM's were all the rage.
All that time, neural networks were deemed interesting but unfeasible for anything realistic. Little did we know.
I feel that if I wanted to get back into this field, I might as well start all over again.
We still build ETL pipelines to process text files into new text files on a single machine like it's 1995. Oh, but it's a virtual machine now. Progress?
And I wouldn't have imagined something as locked up as iOS in my most dystopian nightmares. On the other hand there's Linux and RaspberryPi, which gives me some hope for the future.
I had hoped desktop development would be native compiled RAD tooling similar to Delphi and C++ Builder, the Web would have stayed as interactive documents, all computers would be improved Amiga-like experiences, and the dependency on the CLI would have long been replaced by REPL-based environments.
Ah and all modern OSes would be written in memory safe systems programming languages, following the footsteps from Oberon and Modula-3.
- Grey, so much grey. Backgrounds, buttons, shadows
- Marvel at my FrontPage geocities prowess.
- DHTML
- OS/2 is going to be huge! So much power to be able to multi task
- Eventually everyone will have an AOL account
- Eventually everyone will have a hotmail account
- Access will rule the world. Visual Basic. VBA. Gaudy block-color UIs will be du jour
- OMG Java Applets ruling the world
- VRML will reduce the need to know html
- COM/COM+ is the future of programs transferring data between components
- Buying more machines and spreading the load, rather than bigger machines will be the way of the future
- Novell Groupwise vs MS Mail
- Netscape should just buy Microsoft
- Borland C/C++ was going to rule the world
- Drag and drop UI designers might someday not screw up at the faintest hint of advanced layout (and even might get fast eventually)
- Windows would "Beat" Macs
- Linux will eventually get to be mainstream (desktop)
- ColdFusion will be the fastest / most secure server product
- Having opinions on Lycos / Altavista / Yahoo / Google
- Everyone will want to run their own home squid server to make things faster
- someday programs would be big enough to come on multiple disks
- someday programs would be big enough to come on multiple CDs
- Neuromancer / matrix / shadowrun style human computer interaction would come true in the 2000s
- Human computer interactions would tend towards mostly speech / audio rather than qwerty
- jQuery accordions will rule the world
- XHTML will be the new hotness
- XML will save the world from incompatible data and save so much time
- Source control might someday not screw up XML / UI design files
From the perspective of early-90s, I was all-in on Plan 9 and expected to see a lot more influence from it. Unfortunately that didn't really happen, with a few exceptions (/proc).
Obviously, this happens now, but I think it will become more pervasive and many more programmers will find the pattern incorporated into their daily work stream. I see the act of programming becoming the human guiding the computer to write the programs, with the human validating the result.
I could be way off. I’d have bet anything in ‘94 that Beck would be a one hit wonder.
If I could tell my young self that as of 2021 we are not talking about kilo- or mega-cores, but that my new M1, which I am very excited about, is a mere 8 cores, I would have been severely disappointed.
[1] https://www.gamasutra.com/view/news/104365/Feature_The_Trans...
I thought that, in the future, everybody would have a computer, so it would be a part of basic literacy. Most of our generation would be obsolete because we wouldn't stand a chance against people who had all had computers in their homes from birth.
Then the smartphone was invented and most people were just satisfied with that. Now there are even fewer people who know how to use a keyboard than in our generation. No matter how hard you try, you can't beat the efficiency of a keyboard with a palm-sized touchscreen input device for writing.
In the late 1990s to 2000s, there were so many personal text web sites, public forums and chats in my language, Japanese. Now it's all gone. There is no place on the modern Internet where I can enjoy serious text-based communication in Japanese, because everyone uses a smartphone now and it's so inefficient to write serious text with one.
I also didn't anticipate the huge rise of SaaS: software that depends on a remote server to function. I was a believer in P2P technology in the 2000s. I believed the future network would work on top of a P2P mesh network. It failed for various reasons, but mostly because it was too inefficient to scale.
Then I saw the rise of cryptocurrency. It's horrible. Nobody really cares about the technology. People rely on SaaS wallet services rather than hosting a full node. It's the worst hybrid of client-server and P2P: inefficient and still with an authority. NFTs... they don't even host the data in the blockchain! TITAN... it's not a stablecoin, it's just a ponzi-scheme coin.
I hate the smartphone. But right now, it's really hard to find a programming job that doesn't involve the smartphone. Even if I don't directly deal with it, my work still indirectly supports the smartphone, further helping the spread of an inefficient input device for writing.
I have lost all hope for the future. It will be an even more Orwellian world. Most of the computation will be done on the server side and our computers will be mere thin clients to the server. Everything will be censored and regulated. I wouldn't be surprised if we lost the ability to write freely.
We had search back in the 90s. Archie, Yahoo, Lycos, Altavista, etc. But Google brought better ... advertising.
What did I expect? I expected hardware to get better. I didn't expect the phone to be so dominant. The iPhone 12 to someone from the 90s might as well be a Cray tricorder.
Hardware is a LOT better now. Dunno if software is that different. JITs are crap; they re-invent the same thing over and over. Linux re-invents what VMS and other commercial operating systems have already done. Most progress has been either hardware or enabled by hardware.
Did not see the workstation companies going away so fast
Disk storage and networking got really, really cheap
Did not expect such fast mobile phones (cpu and bandwidth wise)
The biggest advances have been distributed version control and dependency management systems.
I was expecting Guile to supplant Perl and Python, and for us all to benefit from the greater reach a Lisp could give us. What we got was motherfucking JavaScript as far as the eye can see, and development processes that hobble productivity so that companies can keep hiring programmer-clerks in legion strength and the executives remain unthreatened.
I thought VB would grow and mature for building UIs and backing them with more robust code, but it just kind of petered out.
If anything iOS storyboards attempted to bring that idea back.
I did not see infrastructure as code being a thing.
Then I would be wrong again, because I would have thought IPv6 would be everywhere...
Interestingly, I absolutely didn't see the web coming. I was very interested in BBSes and Minitel (France), but I didn't connect the dots until I heard of Amazon and Facebook and Google making it big.
But TBH, I didn't see how transformational computing was. Most of the revolutions that came, I saw as mere gadgets (including the mobile phone), or funny things to do. It's only very late that I understood the potential.
The only things I saw right were: Linux will eat everything (not because of hindsight, just because I wanted it to be :-)), and that independent gaming would come to life when tools got cheaper (and I was spot on).
My third prediction was: computing will become invisible and ubiquitous (c'mon, one has to carry a computer in one's pocket all of the time? tss...). This has yet to be realized :-) Maybe with 5G? :-)
I was also fortunate enough to learn FP back then and it allowed me to see the future of many programming languages.
But overall, I was "inside" the computer and I didn't realize how much it'd change things and how much it'd change itself.
And I kept arguing that in a decade and a half, there would be a consolidated monolith and individuals would lose power for a long time to come.
Whenever there is a power tool, expect the power hungry to want to control it. My personal motto. Proves right every time.
But I digress.
Looking back on the three decades since I graduated college in '90 (with CompSci as one of my 3 majors), there were three things which I saw as big game-changers, and became both a user and ardent evangelist for: the TCP/IP Internet, Linux, and Free/Open Source.
Today, it's easy to think of these as a given. But in the 90's they were not, with many battles fought, the outcome of which was not always clear and not always successful: competing protocols, OSes, applications and platforms, business inertia and ignorance (not in the negative sense, but unaware/uneducated), and commercial companies protecting revenue flow ("No one ever got fired for buying IBM/Microsoft/Novell/Oracle").
I figured that once MLMs could be automated they would replace advertising. Folks would make money connecting each other to products and services through a kind of generic MLM network. Word-of-mouth on steroids. Heck, I still think it could work...
What else?
Flying cars, and flying cities (see Bucky's Cloud Nine), fusion power and atomic tools, robots that could wash dishes and do the laundry, factories that build and deliver whole houses (Bucky again: Dymaxion Home), arcologies (ecologically harmonious mega-cities designed and built as single buildings, Paolo Soleri), Moonbase ("Welcome to Moonbase" by Ben Bova), and those are just the plausible things, linear projections of the capabilities on hand. Not even talking about nanotech, longevity drugs, electrogravity, etc.
- - - -
It's much easier to think of things that did happen that I never expected. The first major divergence was Twitter. I don't think anyone predicted Twitter. Then there's Facebook. If you could travel in time to, say, the 80's or a little earlier and you tried to explain FB to people they would think you were stark raving mad. "Put every detail of your life in the hands of some corporation? A voluntary spy network beyond the wildest dreams of the Stasi or Gestapo? Yeah right." But here we are... I never thought JS would still be with us, let alone turd-polished to such a gloss. People spreading lies and misinformation, that's not what the internet was invented for. It was supposed to put the world's information at your fingertips, not be a mighty bullshit engine. "Phones" that excoriate minds, no one predicted that (except the Church of the Subgenius: the "white stone" came as prophesied.)
I thought Java applets (those things you'd embed in a webpage) were hot stuff that would completely reshape the Web someday, but they managed to effectively die before the proprietary Flash. I thought Java as a server language was doomed for performance reasons, but that turned out to be just fine. Basically I completely misunderstood where Java was going.
The expectations around smart home stuff were way off too. E.g. expectations were more along the lines of your house reminding you to go buy toilet paper. What we got was the ability to ask a hockey puck to buy some more when we notice ourselves running low, if you can get past the privacy concerns.
Did anybody have 5 Ghz as the "max" cpu frequency on their bingo cards?
Actually, I remember touring an Intel fab. around 99/00. The guide (an engineer) was pretty sold they were "going to soon max out silicon for CPUs and have to move to germanium" based chips. (Apparently it's too expensive to this day.)
Older, I seem to remember Feynman being pretty optimistic about quantum computers? But that might just have been my takeaway.
First, accessible files would become so numerous that no one would be able to navigate them sensibly with the tools that we had. We would drown in a sea of random gratuitous data.
Second, user interfaces would tend toward a swamp of accidental complexity unless someone made it their business to compete on the basis of superior user experience, based on sound UX research.
Third, unless we could figure out a framework for individuals to own and control their own data, everything would end up owned and controlled by large, faceless institutions with no accountability, and regular people would be reduced to serfs with no effective control of their own data.
I wrote an email about these predictions and circulated it at Apple, trying to get someone interested. I tried to pitch the idea of tackling these problems head-on.
Nobody bit.
Partly that's because I'm not a great salesman. I lack the absolute conviction and the necessary zeal.
Partly it's because I was a minor cog in a large machine.
Partly it's because Apple was in a death spiral at that moment, and didn't have a lot of attention to spare for grand visions of the future.
I had worked for a couple of years on Apple's Newton project, then at NeXT. My expectations of the future arose from stuff I saw while working on those projects. Both Newton and NeXT tried really hard to make user experiences that were natural and comfortable and productive. Both were concerned with trying to connect people using ubiquitous networking.
Both succeeded in no small measure, but both also exposed problems that didn't become visible until you started connecting everything together and trying to design interfaces to help people make sense of it all.
I didn't sell my pitch to anybody, and neither did anybody else. Instead, I think we more or less got the dystopia I was afraid of.
Instead companies created an endless churn of features, rewrote everything because fixing the old problems seemed too difficult (hence inventing endless new problems), hacked new languages for convenience (and to create lock-in) then rediscovered why all that complexity was in the old languages, security was ignored, everyone built new code layered on top of the old stuff we hacked instead of fixing the old code, and everyone switched to commodity hardware creating a race to the bottom driven by low cost.
I've been expecting computing infrastructure to crumble for many years. I think the many modern problems with computing and networks, security in particular but also reliability, indicate it is starting to happen. It will take decades to fix.
Part of the problem seems to be the limited lifespan of people. The old engineers retire and are replaced by the young and a lot of hard won knowledge is lost. Books and the code itself pass on some of that knowledge but it is incomplete (or ignored because who wants to spend the time? We need something new to sell in three months.)
The latest IDE's are certainly doing more of this with code highlighting and autocompletion, but IMHO it could still be easier for coders to more visually annotate and document code.
Much later I discovered things like Ruby on Rails and Django. I switched to emacs and most others switched to editors like vim and VS Code. And things are just generally so much better now. A lot of the things I imagined would be possible back then actually happened.
What I got wrong was how popular the internet would become and what it would be used for. Back then it was mainly a space for geeks. Sure there were a few non-geeks doing internet shopping even back then, but anyone contributing to the internet was more of a geek. There were no girls on the internet. Anonymity was default. While, of course, there was a person behind each handle, it was no concern who they were in meatspace. On the Internet, everyone was equal. Anyone could be anything. Like Neo in the Matrix. That all changed with social media and the big corps decided they wanted to tie us to meatspace IDs. Everyone joined and they brought their meatspace bullshit with them. I didn't see that coming.
We were migrating it to a Sun 690MP running SunOS. So, for the end users of PNMS, we got to try different ways of getting them access. DOS/Windows and packet drivers, Sun X-Terminals, Workstations, etc. The DOS and Windows experiences were buggy and generally lousy for the end users. The Sun xterms and workstations were great, but crazy expensive. So I was able to sell the otherwise stodgy management on Linux. Slackware, specifically, in the Linux kernel 0.99-1.x days.
Linux was still buggy at the time, but if you used the right hardware, and rebooted at night, the end users didn't notice much of the issues.
So, after getting it all working, it was interesting to me that a one-man show OS was working better and/or cheaper than what the industry leads could offer. I knew it would go somewhere.
Similar for the web. This was at a US military base, so they had access to the internet. I was given a Linux PC and modem to take home for on-call, so we had internet at the house. I remember showing my wife the very early Pizza Hut website where people in one city in California could order pizza on the site. She didn't get it, perhaps because she's very outgoing and the phone seemed natural. Being more of an introvert, though, I did get it, and assumed the web would be important. Though, I didn't predict how important.
I also made a lot of utilities for end users, and presumed that Tcl/Tk would take over the world. That was after struggling with X11, Athena, etc. That prediction didn't pan out. But Ousterhout later co-created Raft, which is sort of eclipsing Tcl now.
Wish someone had told me, "Startups are fun, but look at how few of them make a buck. Be a dev, but focus on enterprise and public sector services..." As someone who likes making money, probably just as much if not more than solving problems, I'm happy for the successes I had with startups in my 20s, but I don't think it's something anyone should count on. Look for a nice fat enterprise / government contract. Buy yourself nice toys. Don't waste your time on projects that can't self-fund from day 1.
Another: There was always this idea that in the future, applications would be built to run on the database, such as with stored procedures. Obviously, this was widely done, and continues to be done, but it never really matured in the way we expected. There is still a big gap in tooling to make this easy.
This is also related to the ideas we had back then about object persistence. One day, we imagined, we would just manipulate objects in memory, and all of the concerns about persisting those changes, including making it performant, would be taken care of.
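A rough sketch of how far short of that vision we still fall, using Python's standard-library shelve module as the "transparent" object store (an illustrative example only): even here, you still have to think about when changes get written back.

    import shelve

    class Account:
        def __init__(self, owner, balance):
            self.owner = owner
            self.balance = balance

    # Store an object by key; shelve pickles it transparently, no SQL involved.
    with shelve.open("accounts") as db:
        db["alice"] = Account("alice", 100)

    # Load it back, mutate it in memory, then explicitly write it back.
    with shelve.open("accounts") as db:
        acct = db["alice"]
        acct.balance += 50
        db["alice"] = acct   # without this reassignment the change is lost

That explicit write-back (or an ORM's session/flush dance) is exactly the seam the orthogonal-persistence vision hoped would disappear.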
Also, transactional memory was surely going to be the norm, whether in software or hardware.
I also thought we'd see new paradigms in GUI and desktop design take over, probably involving 3D.
Everything related to how the internet has developed has been both awe inspiring and deeply disappointing.
win32 applications, linux gui applications.
java applets for rich UIs for websites (javascript was a joke).
Like a number of other commenters mentioned, folks like me who were in STEM majors all had to learn how to code in some form or another, though I don't think I ever took any classes as such. A popular joke then: "what do mechanical engineers do when they graduate?" "they learn to code and become programmers"
In my case, I had a part-time job in a lab writing programs in C++ to get various lab instruments to be remote controllable from a terminal and had to learn on the job how to do it. One of the professors in the lab gave me a copy of "Numerical Recipes in C" to help me with the C++ code I was writing. It didn't help at all!
I was also part of the generation that learned basic programming on a TRS-80 (aka "trash-80") in BASIC, then in HS some Pascal and FORTRAN in an informal computer club. My family never owned a home computer, but the school had a bunch of Windows machines curious students could play with.
Looking back now, it seems like I was really primed to be part of one of the first waves of professional programmers, and indeed a number of my friends did go that route. But by the time I graduated from university, I kind of hated coding! Recall that in the 90s there was no stack overflow, no google, no way to see how others might have solved the problem. If you couldn't figure out how to get something to work, you had to just grind away at it no matter how long it took.
At the time, I honestly thought whatever future there might be for professional programming I sure didn't want to be a part of it. And I also didn't think it would take over the world like it did. I left uni and I spent a number of years traveling abroad and teaching ESL before dusting off my coding skills and getting a series of coding jobs in various industries.
I am extraordinarily grateful that I learned "the hard way" how to code, but more than once I wish I had gotten my degree in CS instead of ME.
One of the biggest reasons for the miss was I had expected cross-platform app development to become much more mature much more quickly than it actually did. The long persistence of apps that only ran on Windows was something that I didn't expect.
I also didn't expect the fast decline of desktop computers leading to the vendor lock in effect that led to the best laptops tending to either run Windows or Mac OS, further eroding the relevance of Linux not just because Linux didn't ship on those devices but also because it was a second class citizen for drivers.
I also did not expect to be making this post from an Android phone decades later.
We see robots in manufacturing plants (e.g. cars), but not too many visible when you walk down the street. Of course, they're there, but hidden inside various normal looking objects. Automatic street lights, driverless trains, boom gates for public transport.
We got the opposite on all fronts.
When I went to university (2013-16ish) there were still a couple of professors clinging to the idea that everyone would see the light and semantic web would be broadly implemented, but aside from that everyone else I've ever discussed it with has considered it a dead or vapourware technology. I am curious as to whether it was seen as viable or realistic back in the day, or just a pipe dream.
The things I did see were kind of right there. You could see Linux taking shape - and the biggest driver there, IMHO, was that Unix as an OS removed a lot of constraints Windows and macOS had at the time, that Linux was free & open, and that the entire teaching environment used some form of Unix.
You could see that Internet thing take shape. (I'm still cranky with myself I didn't see it enough, so I turned down several excellent startup options). A commercial future was still dubious. At least to me, but then, I'm not a futurist ;)
You could believe, for a glorious moment, that open sharing was an unalloyed good, and privacy wasn't necessary. (That lasted about 18 months, but for a brief moment in time, it felt amazing)
Most of us spent a lot of time making fun of early video sharing attempts, because we could do math. (Bandwidth made it impossible). We were less good in estimating how much bandwidth would increase. Also, the funniest early attempt was Oracle, because Oracle had a reputation even back then.
The one thing I did "see" early on was that single-core was headed to the dustbin. Mostly because I had the fortune to work with a bunch of people who worked on transputers. A dead-end tech by itself, but it made a very clear point that networked systems of cores are extremely powerful.
Looking back, what strikes me most is how little has changed. We had passionate editor wars. We had OS fanboys. We had our own overhyped and underused lazy FP language (Miranda, back then). We still applied functional principles, just not in that language. Lisp was supposed to be cool, but nobody used it. Nobody wrote FORTRAN, but we'd have been doomed without numeric libraries from FORTRAN. C (and figments of C++) were the workhorse for heavy tasks. The lighter lifting, you did in a scripting language (Perl), because ain't nobody got time for C++. GUIs were derided as fat and bloated, and people in general clamored for the good old times of fast and light software. Programs were at the outer limit of complexity we could manage.
Just when I started to think that will never happen, it happened;-)
NCSA Mosaic, meh. That will never catch on.
All of the 1990's "Desktop" software projects that failed horribly. I use way better versions of these today on the web and they actually work -- success! I am still not happy with the NCSA Mosaic user interface.
In the 80s, I dreamed of a 1 million pixel monitor: 1024 x 1024. It's 2021, and I am typing on a 1366x768 laptop - big fail.
I knew video game worlds would get bigger and more realistic than the wire renders of the BBC Elite, but I didn’t expect them to be full of loot boxes and other people.
Sure, the web is infinitely more useful today, but I feel like a corporate web slave when using it ... dodging trackers, VPN's myself against XYZ ... etc.
I also thought that RAD tools would continue to stay popular.
Personally I still think that stuff will happen, it's just delayed by people's stupidity/ignorance.
Also discussed was remote storage, globally available file shares.
It looks like this is happening with what we now call "Cloud Computing."
The Internet would be a lot less centralized than it has become with a lot more peer-to-peer traffic.
The Internet would be very disruptive to political systems and large corporations, promoting a lot more small scale DIY and anarchistic type stuff. (Standard issue 90s techno-libertarianism... we did not understand network effects.)
I thought the Internet would decentralize us physically too, making it possible to do the same work from some 'holler in West Virginia as the middle of Manhattan. (I still don't count this one out. COVID pushed us a little further in this direction. This is more a cultural thing than a technical thing, and culture changes slowly.)
C and C++ would be dead. Nobody would write in memory-unsafe languages anymore and garbage collection would be ubiquitous. We'd probably have type systems that caught a lot of bugs too. Rust is finally delivering on the latter, so this may eventually come to pass.
CPU frequencies would continue increasing, or would at least level off at a higher frequency than they have. I would have guessed 10 GHz or so, since beyond that the speed of light vs. the size of the die becomes an issue.
AI would remain very far off. I did not imagine self-driving cars until ~2050 or so. The 90s was still the "AI winter."
Apple would be dead by now.
X86 would be dead by now. I never would have imagined it would have been possible to get this kind of performance out of that hairball of an instruction set.
Spinning disk would be dead. (It is on PCs and mobile devices, but I mean everywhere.)
I didn't predict mobile at all. I would have said such devices are too small to have good UIs. How would you use something with a tiny screen like that for anything but maybe texting?
I thought we'd all have at least a gigabit (full duplex) at home by now.
I was really pessimistic on space, at least in the near-mid term future. I didn't expect to see humans still in space by 2010-2020, let alone something like SpaceX.
I was really pessimistic on energy. By now I would have predicted we'd be well into a global depression brought on by fossil fuel exhaustion. We'd be deploying nuclear fission as fast as possible to keep the grid up, but cars and airplanes would become extremely expensive to operate (thinking ~$20/gal gas). We would have reduced mobility but we'd all be connected.
I wasn't concerned about climate change because of course we would run out of cheap fossil fuels before that would become a really big issue.
... can probably think of more.
Doesn't get much more 90s than Bill Gates
https://www.gatesnotes.com/About-Bill-Gates/The-Road-Ahead-a...
Specifically, I thought languages would be designed around debugging use cases. The fact that we are still using debuggers and loggers and printlns to debug blows my mind.
- Useable VRML (Minority Report was 2002!)
- CPUs that could directly run Java bytecode (Sun never shipped it, but everything had a "powered by Java" sticker on it back then)
I still have dreams about being able to point a cell phone at a physical object (television, fast food restaurant, car etc), and being presented with an AR GUI that controls that device. I suspect this is where things will head with Apple's AR headset. This is not a new idea really... in the mid to late 90s there were specialty AR headsets which could read QR codes and look up schematics and overlay them for you. But it was, literally, a 486 that sat on your head.
Other things that blindsided me... The current advances in AI. I worked at an AI lab in the late 1990s. Our pride and joy was a demo that could, given a dataset of 500 images, differentiate between a few classes of images... really simple stuff like, flowers, animals, landscapes, machines. The images were like 48x48 and it took weeks of training on a Sun Ultra workstation.
And this was an interesting lesson for me. To quote a friend of mine who was/is an AI researcher--"These are the same algorithms we used 25 years ago. It's the computation that has changed." He had an idea to make a botany app which could tell what kind of plant it was based on a picture of the leaves-- this was 12 years ago at least. I told him it was impossible because I couldn't fathom how it could work. But he was right of course, and now I believe there are such apps.
Back at that university in 2000, I had an array of Sun A1000's-- like 70 disks in total to make 1TB of storage. It cost the university 250k. Today, I have about 100TB of storage at my house personally, and it cost me... 5k maybe? Similarly, my cheap graphics card has more power than every computer on earth before at least the mid 90s.
This is to say, when the SCALE of things change, your capabilities change a lot. This is something that worries me about Google/Apple etc. They have access to oceans of compute power/storage. How could a small company ever compete with that? I would like to get back into AI as I always found it fascinating-- but how can I even learn when access to resources like that cost billions of dollars? I can't even get a decent video card or RAM for my computer at the moment.
We also seem to be at the end of this era of ever-increasing capabilities. We will need a big breakthrough to keep up the pace, or we will have to get much better at using resources.
--
I'm also very surprised by how little what I consider fundamentals of CS matter in the day-to-day. I probably had the last of the classical engineering educations-- full tour of calculus, physics, chemistry, partial tours of like EE, chip design, automata, relational algebra, to mention nothing of the humanities.
When I was a team lead at a large corp, the young folks would be impressed about how I would use an idea from another discipline to solve an engineering problem-- but I think the era of that kind of guy is over. I think what's important now is speed and loyalty. I watch in amusement as the JavaScript community discovers stuff that has been known to the wider world for decades.
Regarding loyalty-- I was raised with the Scientific mindset-- that our commitment was to the truth. A colleague of mine once worked for Sandia National Laboratories and he jokingly suggested that they falsify the results of an experiment-- and he was pulled aside and told something to the effect of, "We are scientists, we do not do that. Don't even joke about it."
I feel like that mindset is absent, even in science, let alone business. The most important skill is loathing yourself while pretending that whatever bullshit alternate version of reality your boss's boss is forcing on you is great.
This is something I could never adapt to. Working 60 hours weeks simply because of bad management. In my corporate gigs, it was always the same-- when a project went well it was because the manager did a good job. When a project went poorly-- it was because engineering fucked it up. I've worked on projects where 9 months out we warned management about XYZ. They ignored it, then blamed us when what we said was going to happen, happened. I think engineers are ill-equipped to defend themselves in these political environments.
In short, I never imagined most of my job would be going to meetings and lobbying for common sense.
Good grief
I was kinda hoping we'd have ubiquitous flying cars by now, but hey Uber right...lol
I thought that the internet would kill the tabloid press (I am in the UK), and give the public better quality news and information. Instead, the internet became the tabloid.
I thought that it would be the "other guys" who would first leverage the internet as a political platform. People descended from 90s hacktivists, DMT-eating civil libertarians and the like. The rise of internet driven populism was a surprise to me.
I thought that in the future I would own an SGI home computer with superpowered custom chips, or something like a Playstation 2 with an os and keyboard. That is, something highly integrated like the home computers were.
I kind of miss the immersive experience of working with a computer where the whole thing is a unified design and you have a couple of big books which tell you everything you'll need to know to program them, and all the layers of abstraction are accessible and make sense to the programmer-user.
Back then, I thought that the computers of the future would be more profound rather than merely more complex. That they would still be knowable to the lone hacker, only they would demand more from them. Instead we have these huge complexes of ordinary complexity, reminiscent of power station piping diagrams, that we work on in teams as bureaucrats of abstraction. Of course, back then I was a kid, in love with the mystique of computers, there was a lot I romanticized, and a lot I didn't know.
I thought there would be more visual tools for programming, things derived from NeXT Interface Builder, Visual Basic and multimedia authoring systems that people used to use to make CD ROMs and information kiosks in the 90s. And I thought that the development process would be more integrated. We wouldn't be bothering with stacks of batch-mode build tools and things like that, and it would be relatively easy to open a window or play a sound in a high level language without having to do C bindings to third party cross-platform toolkits etc.
I didn't think programming would be replaced with graphical tools, I thought it would be augmented by them, in a unified, ergonomic package.
I thought VR would get going sooner than it did. I was a big fan of Howard Rheingold's 1991 book "Virtual Reality" (a great guide to early VR) and got to try it out in 1993 at a computer exhibition. I used to fantasize about going to VR arcades and playing flight simulators with a bunch of people. Now I do it at home with IL2: Great Battles.
One prediction I got right was shaders on graphics cards. Back in the late 90s I was a user of BMRT, a shareware Renderman-compliant raytracer which introduced me to the idea of programmable shaders. I figured it was only a matter of time before they ended up in consumer hardware, and my ears pricked up when the author of BMRT went to work for Nvidia.