A sense of mastery and adventure permeated everything I did. Over the decades those feelings slowly faded, never to be recaptured. Now I understand nothing about anything. :-)
Starting in 1986 I worked on bespoke firmware (burned into EPROMs) that ran on bespoke embedded hardware.
Some systems were written entirely in assembly language (8085, 6805) and other systems were written mostly in C (68HC11, 68000). Self-taught, and written entirely by one person (me).
In retrospect, perhaps the best part about it was that even the biggest systems were sufficiently unsophisticated that a single person could wrap their head around all of the hardware and all of the software.
Bugs in production were exceedingly rare. The relative simplicity of the systems was a huge factor, to be sure, but knowing that a bug meant burning new EPROMs made you think twice or thrice before you declared something "done".
Schedules were no less stringent than today; there was constant pressure to finish a product that would make or break the company's revenue for the next quarter, or so the company president/CEO repeatedly told me. :-) Nonetheless, this dinosaur would gladly trade today's "modern" development practices for those good ol' days(tm).
The internet was mostly irrelevant to the line of work I was involved in, although it was starting to have an impact. We had one ISDN 2x line for the entire office. It was set up to open on demand and time out a few minutes later, as it was billed by the minute.
I worked on an OpenGL desktop application for geoscience data visualization running on Irix and Solaris workstations.
Work-life balance was great, as the hardware limitations prevented any work from home. Once out of the office I was able to go back to my family and my hobbies.
Processes were much lighter, with far less security paranoia, as cyber attacks weren't a thing. The biggest IT risk was someone infecting a computer with a virus from the disk they brought in to install Doom shareware.
The small company I worked for did not have an army of product managers, project managers, or any similar buffoonery. The geologists told us developers what they needed, we built it and asked if they liked the UI. If they didn't we'd tweak it and run it by them again until they liked it.
In terms of software design, OO and Gang of Four Patterns ruled the day. Everyone had that book on their desks to accompany their copies of Effective C++ and More Effective C++. We took the GoF a little too seriously.
Compensation was worse for me though some of that is a function of my being much more advanced in my career. These days I make about 10x what I made then (not adjusted for inflation). That said, I led a happier life then. Not without anxiety to which I'm very prone but happier.
I was the whole tech staff, work-life balance was reasonable, as everything was done working normal day-shift hours. There was quite a bit of driving, as ComEd's power plants are scattered across the Northern half of Illinois. I averaged 35,000 miles/year. It was one of the most rewarding times of my life, work-wise.
The program was essentially a set of CRUD applications, and I wrote a set of libraries that made it easy to build editors, much in the manner of the then-popular dBASE II PC database. Just call with X,Y,Data, and you had a field editor. I did various reports and for the most part it was pretty easy.
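Something in this spirit, as a rough C sketch (the real library is long gone; this assumes a DOS-era compiler with <conio.h>, and all the names here are invented):

    #include <conio.h>

    /* Minimal dBASE-style field editor: park the cursor at (x, y),
       let the user type into buf, return when they hit Enter. */
    void edit_field(int x, int y, char *buf, int max)
    {
        int n = 0;
        gotoxy(x, y);
        for (;;) {
            int c = getch();
            if (c == '\r')                      /* Enter: done */
                break;
            if (c == '\b' && n > 0) {           /* Backspace: erase */
                n--;
                cputs("\b \b");
            } else if (c >= ' ' && n < max - 1) {
                buf[n++] = (char)c;
                putch(c);
            }
        }
        buf[n] = '\0';
    }

Wrap that in a loop over a table of field descriptors and you have a screen editor.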
The only odd bit was that I needed to do multi-tasking and some text pipelining, so I wrote a cooperative multi-tasker for Turbo Pascal to enable that.
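The idea of a cooperative multi-tasker is simple enough to sketch (in C here rather than Turbo Pascal, with hypothetical task names): each task does one small slice of work and returns, i.e. voluntarily yields, and the scheduler just loops over them forever.

    /* Round-robin cooperative scheduler: no preemption, so every
       task must be well behaved and return quickly. */
    typedef void (*task_fn)(void);

    static void poll_serial(void)   { /* drain the UART buffer */ }
    static void update_screen(void) { /* repaint dirty fields  */ }
    static void run_report(void)    { /* advance the report job */ }

    static task_fn tasks[] = { poll_serial, update_screen, run_report };

    int main(void)
    {
        for (;;) {
            unsigned i;
            for (i = 0; i < sizeof tasks / sizeof tasks[0]; i++)
                tasks[i]();     /* each call is one "time slice" */
        }
    }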
There weren't any grand design principles. I was taught a ton about User Friendliness by Russ Reynolds, the Operations Manager of Will County Generating Station. He'd bring in a person off the floor, explain that he understood this wasn't their job, and that any problems they had were my fault, and give them a set of things to do with the computer.
I quickly learned that you should always have ** PRESS F1 FOR HELP ** on the screen, for example. Russ taught me a ton about having empathy for the users that I carried throughout my career.
The awful part was C++. There were only two popular programming languages: C++ and Visual Basic. Debugging memory leaks, and memory corruption due to stray pointers and so on in C++ was a nightmare. Then Java came out and everything became easy.
The great part was everyone had offices or at least cubicles. No "open floor plan" BS. There was no scrum or daily standup. Weekly status report was all that was needed. There was no way to work when you're not at work (no cell phone, no internet), so there was better work-life balance. Things are definitely much worse now in these regards.
All testing was done by QA engineers, so all developers had to do was write code. Code bases were smaller, and it was easier to learn all there is to learn because there was less to learn back then. You released product every 2.5 years, not twice a week as it is now.
You got the feeling of a thousand developers all running off in different directions, exploring the human condition and all of the massively cool things this new hammer called "programming" can do.
Compare that to today. Anywhere you go in the industry, it seems like there's already a conference, a video series, consultants, a community, and so on. Many times there are multiple competing groups.
Intellectually, it's much like the difference folks experienced comparing going cross country by automobile in, say, 1935 versus 2022. Back then there was a lot of variation and culture. There were also crappy roads and places you couldn't find help. Now it's all strip malls and box stores, with cell service everywhere. It's its own business world, much more than a brave new frontier. Paraphrasing Ralphie in "A Christmas Story", it's all just crummy marketing.
(Of course, the interesting items are those that don't map to my rough analogy. Things like AI, AR/VR, Big Data, and so on. These are usually extremely narrow and at the end of the day, just bits and pieces from the other areas stuck together)
I remember customers asking me if I could do X, figuring out that I could, and looking around and not finding it done anywhere else. I'm sure hundreds, maybe thousands of other devs had similar experiences.
Not so much now.
But what we did have were O'Reilly books! You could tell how senior an engineer was by how many O'Reilly books were on their shelf (and every cubicle had a built-in bookshelf to keep said books).
I remember once when our company fired one of the senior engineers. The books were the property of the company, so they were left behind. Us junior engineers descended on his cubicle like vultures, divvying up and trading the books to move to our own shelves.
I still have those books somewhere -- when I got laid off they let me keep them as severance!
Precarious. Very slow. Like a game of Jenga, things made you nervous. Waiting for tapes to rewind, or slowly feeding in a stack of floppies, knowing that one bad sector would ruin the whole enterprise. But that was also excitement. Running a C program that had taken all night to compile was a heart-in-your-mouth moment.
Hands-on.
They say beware a computer scientist with a screwdriver. Yes, we had screwdrivers back then. Or rather, developing software also meant a lot of changing cables and moving heavy boxes.
Interpersonal.
Contrary to the stereotype of the "isolated geek" rampant at the time, developing software required extraordinary communication habits, seeking other experts, careful reading, formulating concise questions, and patiently awaiting mailing list replies.
Caring.
Maybe this is what I miss the most. 30 years ago we really, truly believed in what we were doing... making the world a better place.
The biggest change is that software became a high-paying, high-status job. (In parallel, "nerd" became a cool thing to be, maybe not for entirely different reasons.)
I still remember when the dotcom boom started, and I first saw people claiming to be experts on Web development (and strutting about it) while wearing what I assumed were fashionable eyeglasses. I had never seen that before.
Before software was a high-status job -- although in many areas it was dominated by men (maybe because a lot of women were homemakers, or discouraged from interests that led to software) -- there were a lot of women, and we're only recently getting back to that.
About half my team at one place was female, and everyone was very capable. Software products were often done by a single software engineer, teamed up with a marketing product manager and technical documentation, and often with a test technician/contractor. To give you an idea, one woman, who had a math degree, developed the entire product that instrumented embedded systems implemented in C, for code path coverage and timings, and integrated with our CASE system and our in-circuit emulator. Another of my mentors was a woman who worked 9-5, had kids, and previously worked on supercomputing compilers (for what became a division of Cray) before she came to work on reverse-engineering tools for our CASE system.
Another mentor (some random person from the Internet, who responded to my question about grad school, and she ended up coaching me for years), was a woman who'd previously done software work for a utility company. Back when that meant not just painting data entry screens and report formats, but essentially implementing a DBMS and some aspects of an operating system. And she certainly didn't get rich, wielding all those skills, like she could today.
Somewhere in there, software became a go-to job for affluent fratboys, who previously would've gone into other fields, and that coincided with barriers to entry being created. Which barriers -- perhaps not too coincidentally -- focused on where and whether you went to college, rather than what you can do, as well as "culture fit", and rituals not too far removed from hazing. Not that this was all intentional, and maybe it's more about what they knew at the time, but then it took on a life of its own.
Focus was more (again, for me) on doing clever stuff vs. what I do today: integrating stuff and debugging stuff made by other people.
Compensation was lower than it is now (obviously?), but it was, similar to now, higher than most jobs. 30 years ago, unlike 40 years ago, you could already get jobs/work in software without a degree, so it was life-changing for many, including me, as I was doing large projects before and during university, which gave me stacks of cash while my classmates were serving drinks for minimum wage and tips.
I guess the end result these days is far more impressive in many ways for the time and effort spent, but the road to get there (stand-ups, talks, agile, bad libraries/SaaS services, fast-changing ecosystems, devops) for me has mostly nothing to do with programming, and so I don’t particularly enjoy it anymore. At least not that part.
Process reflected that; our team just got a stack of paper (the brief) and that’s it; go away and implement. Then after a while, you brought a CD with the working and tested binary, went through some bug fixes, and the next brief was provided.
One of the stark differences I found, at least in my country (NL), is that at the end of the '80s and beginning of the '90s, almost all people told me: you program for 5-8 years and then become a manager. Now, even the people who told me that at the time (some became, on an EU scale, very prominent managers), tell me not to become a manager.
Realistically, it was way harder. Tools were rough, basically just text editors, compiling was slow and boring, and debugging tools were super primitive compared to today's. And to learn programming back then, especially making the first steps, you had to be really motivated. There was no Internet the way we know it today, so you had to dig a lot to get the info. The community was small and closed, especially to complete newbies, so you had to figure out all of the basic steps by yourself and sort of prove yourself. In the beginning you had to really read books and manuals cover to cover many times - and then again later you'd end up spending hours turning pages, looking for some particular piece of info. It was super time-consuming; I spent nights and nights debugging and figuring out how to call some routine or access some hardware.
On the other hand it had that sense of adventure, obtaining some secret knowledge, like some hermetic secret society of nerds that you've suddenly become initiated into, which to high-schooler me was super exciting.
I also feel back then software wasn’t viewed as a well-paid career, so you had more people who really wanted to do this because they were interested. When I look around today there are a lot of people who don’t really like the job but do it because it pays well.
It was also nice to not have to worry much about security. If it somehow worked, it was good enough. No need to worry about vulnerabilities.
I recall the big disruptors being a transition to GUIs, Windows, and x86 PCs. DOS and command line apps were on the way out. Vendors like Borland offered "RAD" tools to ease the Windows boilerplate. Users were revolting and wanted to "mouse" rather than type.
The transition from C to C++ was underway. The code I worked on was full of structs and memory pointers. I was eager to port this to classes with a garbage collector, but there were vtable lookup and performance debates.
Ward's Wiki was our community platform to discuss OOP, design patterns, and ultimately where Agile/XP/SCRUM were defined. https://wiki.c2.com/
Work was 100% 9am-5pm Mon-Fri in the office. It was easier to get in the flow after hours, so 2-3 days per week involved working late. With PCs, it was also easier to program and learn at home.
Comp was ok, relative to other careers. I recall by 1995 making $45K per year.
Compensation was average: there were no companies with inflated pay packages, so all my engineering friends were paid about the same. Friendly (or not) rivalries were everywhere: Apple vs IBM vs Next vs Microsoft. I'd grown up imagining Cupertino as a magical place and I finally got to work in the middle of it. After the internet, the get-rich-quick period launched by Netscape's IPO, the business folks took over and it's never been the same.
1) Burn and crash embedded development – burn EPROMs, run until your system reset. Use serial output for debugging (see the sketch at the end of this list). Later, demand your dev board contains EEPROM to speed up this cycle.
2) Tools cost $$$. Cross-compilers weren’t cheap. ICE (in-circuit emulation) for projects with a decent budget.
3) DOS 5.0 was boss! Command line everything, with the Brief text editor and its text windows.
4) Upgrading to a 486dx, with S3 VGA – Wolfenstein 3D never looked so good!
5) The S3 API was easy for 1 person to understand. With a DOS C compiler you could roll your own graphics with decent performance.
6) ThinkPad was the best travel laptop.
7) Sharing the single AMPS cellphone with your travel mates. Service was expensive back then!
8) Simple Gantt charts scheduled everything, +/- 2 weeks.
9) You could understand the new Intel processors – i860 and i960 – with just the manuals Intel provided.
10) The C Users Group, Dr. Dobb's, Byte, Embedded Systems Journal, and other mags kept you informed.
11) Pay wasn’t great as a software dev. No stock options, bonus, or overtime. If your project ran late you worked late! A few years later the dot com boom raised everyone’s wages.
12) Design principles could be as simple as a 1-page spec, or many pages, depending on the customer. Military customers warranted a full waterfall with pseudo-code, and they paid for it.
13) Dev is much easier today as most tools are free, and free info is everywhere. However, system complexity quickly explodes as even “simple” devices support wireless interfaces and web connectivity. In the old days “full-stack” meant an Ethernet port.
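For item 1, the serial-debug trick was usually nothing more than polling the UART by hand. A sketch for a PC-style 8250 on COM1, assuming a DOS compiler that provides inp()/outp():

    #include <conio.h>              /* inp()/outp() port intrinsics */

    #define COM1_DATA   0x3F8       /* transmit holding register */
    #define COM1_LSR    0x3FD       /* line status register */
    #define THR_EMPTY   0x20        /* ready for the next byte */

    void debug_putc(char c)
    {
        while ((inp(COM1_LSR) & THR_EMPTY) == 0)
            ;                       /* spin until the UART is free */
        outp(COM1_DATA, c);
    }

    void debug_puts(const char *s)
    {
        while (*s)
            debug_putc(*s++);       /* watch it scroll by on a terminal */
    }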
- 1 language (DATABASIC). Back then it did everything. I still use it now, mostly to connect to other things.
- 1 DBMS (PICK). Back then it did everything. I still use it now, mostly as a system of record feeding 50 other systems.
- Dumb Terminals. Almost everything ran on the server. It wasn't as pretty as today, but it did the job, often better. Code was orders of magnitude simpler.
- Communication with others: Phone, or poke your head around the corner. No email, texts, Teams, Skype, social media, Slack, Asana, etc., etc., etc. 1% of the interruptions.
- Electronic communication: copper or fiber optic. Just worked. No internet or www, but we didn't need what we didn't know we would need someday. So simple back then.
- Project management. Cards on the wall. Then we went to 50 other things. Now we're back to cards on the wall.
- People. Managers (usually) had coded before. Users/customers (usually) had done the job before. Programmers (usually) also acted as Systems Analyst, Business Analyst, Project Manager, Designer, Sys Admin, Tester, Trainer. There were no scrum masters, business owners, etc. It was waterfall and it (usually) worked.
MOST IMPORTANTLY:
- 1992, I spent 90% of my time working productively and 10% on overhead.
- 2022, I spend 10% of my time working productively and 90% on overhead.
Because of this last one, most of my contemporaries have retired early to become bartenders or play bingo.
1992 - It was a glorious time to build simple software that got the customer's job done.
2022 - It sucks. Because of all the unnecessary complications, wastes of time, and posers running things.
Most people my age have a countdown clock to Social Security on their desktop. 30 years ago, I never could have imagined such a state would ever exist.
"Hey guys, check out this thing X I made with absolutely no reason other than to see if I could and to understand Y problem better"
replies:
- "idk why anybody would do this when you could use Xify.net's free tier"
- "you could have simply done this with these unix tools and a few shell scripts, but whatever creams your twinkie"
- "just use nix"
Instead what we had were cheers, and comments delayed while people devoured the code to see what hooked you in, and general congratulations -- most often this led to other things and enlightened conversations.
Everything's always been a 'competition' so to speak, but we weren't shoving each other into the raceway barriers like we do now on the way to the finish line. There was a lot more finishing together.
Apart from that, there were more constraints from limited hardware that forced you to get creative and come up with clever solutions for things that you can now afford to do in a more straightforward (or "naive") manner. It helped that you could (and usually did) pretty much understand most if not all of the libraries you were using. Chances are you had a hand in developing at least some of them in the first place. Fewer frameworks I'd say and fewer layers between your own code and the underlying hardware. Directly hacking the hardware even, like programming graphics card registers to use non-standard display modes. And I'd guess the chance that your colleagues were in it for the magic (i.e. being nerds with a genuine passion for the field) more than for the money were probably better than now, but I wouldn't want to guess by how much.
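For the curious, "programming graphics card registers" meant things like the classic Mode X trick: set BIOS mode 13h, then unchain the VGA by poking its sequencer and CRTC registers so all four memory planes become addressable. A from-memory sketch, assuming a Borland-style 16-bit DOS compiler with <dos.h>:

    #include <dos.h>

    void set_unchained_320x200(void)
    {
        union REGS r;

        r.x.ax = 0x0013;            /* BIOS: 320x200x256, mode 13h */
        int86(0x10, &r, &r);

        outportb(0x3C4, 0x04);      /* sequencer: memory mode...   */
        outportb(0x3C5, 0x06);      /* ...chain-4 off              */
        outportb(0x3D4, 0x14);      /* CRTC: underline location... */
        outportb(0x3D5, 0x00);      /* ...doubleword mode off      */
        outportb(0x3D4, 0x17);      /* CRTC: mode control...       */
        outportb(0x3D5, 0xE3);      /* ...byte mode on             */
    }

    void put_pixel(int x, int y, unsigned char color)
    {
        unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);

        outportb(0x3C4, 0x02);          /* sequencer: map mask...  */
        outportb(0x3C5, 1 << (x & 3));  /* ...select 1 of 4 planes */
        vga[y * 80 + (x >> 2)] = color; /* 80 bytes/scanline/plane */
    }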
Oh, and the whole books vs internet thing of course.
Version control was done using the features of the VMS filesystem. I believe that HP MPE had something like that also, but I may have blocked it out.
Around about late '93 early '94 they hauled the HP terminal away and slapped a SparcClassic (or IPX? IPC?) in its place. I was tapped to be part of the team to start migrating what we could off the VMS system to run on Solaris. So, I had to learn this odd language called See, Sea, umm 'C'?
A whole new set of manuals. A month's salary on books. Then another few books on how to keep that damn Sparc running with any consistency.
Then had to setup CVS. Sure, why not run the CVS server on my workstation!
By the end of '95 I was working more on maintaining the Solaris (and soon HP/UX and AIX) boxes than programming.
I still miss writing code with EDT on VMS and hacking away on fun things with FORTRAN. You know, like actually writing CGIs in FORTRAN. But that is another story.
By 1995 I started dabbling with websites, and within a couple of years was working mostly with Perl CGI and some Java, on Windows and Linux/NetBSD.
Most of my work was on Windows, so that limited the available Perl libraries to what would run on ActiveState's Perl.
I gave up trying to do freelance because too many people didn't seem to understand the cost and work involved in writing software:
- One business owner wanted to pay me US$300 to fix some warehouse management software, but he'd up it to $500 if I finished it in one month.
- A guy wanted to turn his sports equipment shop into an e-commerce website, and was forward thinking... except that none of his stock of about 20,000 items was in a database and that he could "only afford to pay minimum wage".
I interviewed with some companies, but these people were clueless. It seems like a lot of people read "Teach yourself Perl in 7 days and make millions" books. The interview questions were basically "Can you program in OOP with Perl?".
I got a proper developer job on a team, eventually. They were basically happy that I could write a simple form that queried stuff from a database.
Some other people on my team used Visual Basic and VBScript but I avoided that like the plague. I recall we had some specialized devices that had their own embedded versions of BASIC that we had to use.
When Internet Explorer 4 came out, we started having problems making web sites that worked well on both.
Web frameworks didn't exist yet, JavaScript was primitive and not very useful. Python didn't seem to be a practical option at the time.
These days often the demand is more horizontal: stringing together shallowly understood frameworks, libraries, and googled code, and getting it to work by running and debugging.
The scope of things you can build solo these days is many orders of magnitude larger than it was back then.
Still, the type of brainwork required back in the day was most definitely more satisfying, maybe because you had more control and ownership of all that went into the product.
We were also in the "object nirvana" phase. Objects were going to solve all problems and lead us to a world of seamless reusable software. Reusability was a big thing because of the cost. Short answer: they didn't.
Finally, I am astonished that I'm routinely using the same tools I was using 30 years ago, Vim and the Unix terminal. Not because I'm stuck in my ways; it's because it is still state of the art. Go figure.
I'd never go back. The 90's kind of sucked for software. Agile, Git, Open source, and fast cheap computers have turned things around. We can spend more time writing code and less time writing process documents. Writing software has always been fun for me.
Process at these companies was slow and waterfall. Once worked with a group porting mainframe office type software to a minicomputer sold to banks; they had been cranking out C code for years and scheduled to finish in a few more years, and the developers were generally convinced that nothing would ever be releasable.
The people were smart and interesting - there was no notion of doing software to become rich, pre-SGI and pre-Netscape, and they all were people who shared a love of solving puzzles and a wonder that one could earn a living solving puzzles.
IBM had a globe spanning but internal message board sort of thing that was amazing, conversations on neurology and AI and all kinds of stuff with experts all over the world.
I also worked at the Duke CS lab around 1990, but it was hard to compare to companies because academia is generally different. People did the hard and complex stuff that they were capable of, and the grad students operated under the thumb of their advisor.
Wages were higher than, for example, secretarial jobs, but not life-altering for anyone; people didn’t care so much.
Our office was in a Northwestern University business incubator building and our neighbors were a bunch of other tech startups. We'd get together once a month (or was it every week?) to have a drink and nerd out, talking about new technology while Akira played in the background.
It was awesome! I got to write extremely cool AI code and learn how to use cutting edge Mac OS APIs like QuickDraw 3D, Speech Recognition, and Speech Synthesis. Tech was very exciting, especially as the web took off. The company grew and changed, and I made great friends over the next 7 years.
(Almost 30 years later I still get to write extremely cool AI code, learn new cutting edge technologies, find tech very exciting, and work with great people.)
The DOS/Windows stack is what I worked on then. Still using floppies for backup. Pre-standard C++, Win16, Win32s if it helped. Good design was good naming and comments. I was an intern, so comp isn't useful data here.
Yes, things are much better than then. While there were roots of modernism back then, they were in the ivory towers / Really Important People areas. Us leeches on the bottom of the stack associated all the "Software Engineering" stuff with expensive tools we couldn't afford. Now version control and test frameworks/tools are assumed.
Processes didn't have many good names, but you had the same variation: some people took process didactically, some took it as a toolbox, some took it as useless bureaucracy.
The web wasn't a big resource yet. Instead the bookstores were much more plentiful and rich in these areas. Racks and racks of technical books at your Borders or (to a lesser degree) Barnes & Noble. Some compiler packages were quite heavy because of the amount of printed documentation that came with them.
Instead of open source as we have it now, you'd go to your warehouse computer store (e.g. "Soft Warehouse" now called Micro Center) and buy a few cheap CD-ROMs with tons of random stuff on them. Fonts, Linux distros, whatever.
I had spent a year or two working on a knowledge-based machine-control application that Apple used to test prerelease system software for application-compatibility problems. A colleague wrote it in Common Lisp, building it on top of a frame language (https://en.wikipedia.org/wiki/Frame_(artificial_intelligence...) written by another colleague.
The application did its job, using knowledge-based automation to significantly amplify the reach and effectiveness of a small number of human operators, and finding and reporting thousands of issues. Despite that success, the group that we belonged to underwent some unrelated organizational turmoil that resulted in the project being canceled.
I interviewed with a software startup in San Diego that was working on a publishing app for the NeXT platform. I had bought a NeXT machine for learning and pleasure, and had been tinkering with software development on it. A bit later, another developer from the same organization at Apple left and founded his own small startup, and, knowing that I had a NeXT cube and was writing hobby projects on it, he contracted with me to deliver a small productivity app on NeXT. In the course of that work I somehow found out about the San Diego group and started corresponding with them. They invited me to interview for a job with them.
I liked San Diego, and I really liked the guys at the startup, and was very close to packing up and moving down there, but then the Newton group at Apple approached me to work for them. The Pages deal was better financially, and, as I say, I really liked the people there, but in the end I couldn't pass up the chance to work on a wild new hardware-software platform from Apple.
In the end, Newton was not a great success, of course, but it was still among the most fulfilling work I've ever done. Before I was done, I had the opportunity to work on a team with brilliant and well-known programmers and computer scientists on promising and novel ideas and see many of them brought to life.
On the other hand, I also overworked myself terribly, seduced by my own hopes and dreams, and the ridiculous lengths that I went to may have contributed to serious health problems that took me out of the workforce for a couple of years a decade later.
But 1992 was a great year, and one that I look back on with great fondness.
30 years ago the production values on software were a lot lower, so that a single programmer could easily make something that fit in with professional stuff. My first video game (written in turbo pascal) didn't look significantly worse than games that were only a few years old at the time. I can't imagine a single self-taught programmer of my talent-level making something that could be mistaken for a 2018 AAA game today.
The other major difference (that others have mentioned) is information. It was much harder to find information, but what you did end up finding was (if not out-of-date) of much higher quality than you are likely to get from the first page of google today. I can't say if it's better or worse; as an experienced programmer, I like what we have today since I can sift through the nonsense fairly easily, but I could imagine a younger version of me credulously being led down many wrong paths.
I loved having the 6 white "Inside Macintosh" volumes and being able to sit on a couch and read them. I loved that, if you paid to be in the Apple Dev Program, you could get your questions answered by actual Apple engineers working on the bowels of the product you were interfacing to. (We were doing some fairly deep TrueType work in support of the System 7 launch and just afterwards.)
What sucked was there was no web in any practical sense. Code sharing was vastly more limited (CPAN would still be 3 years away). Linux 0.99 wasn't out yet. CVS was state of the art and much collaboration was by emailing patches around. What you were building on was what the OS and proprietary framework/compiler gave you and you were building almost everything else from scratch. Expectations from users were lower, but the effort to reach them was still quite high. Software that I worked on was sold in shrink-wrap boxes in computer stores for high prices. Therefore, they sold few units.
Compensation was a mixed bag. I was making more than my friends in other engineering fields, but not by a lot. The multiplier is significantly higher now in tech.
On the plus side, I was shipping code to go into a box (or be ftp'd from our server), so I can't ever recall being paged (I didn't even ever carry a pager until 1997 and working in online financial services) or bothered when not at work.
I think the world is better today: much easier to build on the shoulders of others, much easier to deliver something MVP to customers, get feedback, and iterate quickly, much easier to A/B test, much easier to advertise to users, much more resources available (though some are crap) online, and the value you can create per year is much higher, leading to higher satisfaction and higher compensation.
I’ve thought about this a lot because I grew up with an Apple IIc in my house, but didn’t learn to program C until 2001. My parents didn’t know how to program. We learned to use the computer for typing and desktop publishing.
Programming books were expensive. Even the commercially used compilers and IDEs were expensive. Mac developers paid for CodeWarrior [1]. I don’t remember source code being available easily. Aside from “hackers” who have a knack for figuring out code, there wasn’t really a path for “read the docs” people to learn unless they lived near a place with lots of physically printed docs (a company, a university, a parent who programmed).
Disk space was a constraint. Programs were split across multiple floppy disks. Computers would run out of space. The length of variable names & things like that mattered because shortening them let more code fit on a disk. I’m not old enough to know how much that affected full computers, but it made a huge difference on a TI-82 graphing calculator. That was the first thing I learned to program & cutting lines of code left room for games. I assume a lot of bad habits that make code hard to read came from that time.
Oh… there was no source control. Revisions were just named differently or in a different directory, if they existed at all. And… grand finale… project plans included days or weeks to print disks and paper user manuals so apps could be sold in stores :-)
It was a mixture of VGA CRTs and VT220s. I just missed token ring.
New graduate, started in Australia at $35k. That's about $65k in current dollars. No stock, maybe a bonus.
Manuals were binders or printed books. Builds were managed by the "Configuration Manager", and coordinated nightly. The languages were C, Ada, Assembly.
The network was BNC cables.
Design reviews were a thing, code walk throughs were a thing. People were trying to work out how to apply UML.
Printers were still a mix of dot matrix and lasers.
Design patterns, I think, had just become a thing.
Work life balance was okay. Everything was waterfall, so you ended up in death marches semi-regularly.
Linux was just ascending, but big-iron Unixen ruled the dev environment. Microsoft was still trying to work itself out (Microsoft Mail, Lotus 123, Domino).
Mainframes and business computing was much the same as it is now. Struggling to get to the bottom of requirements, lots of legacy code, significant dev/test/production handover. In the early 90s business was going in the direction of 4GLs (fourth generation languages), which were a more model-driven approach to development. Something that wasn't really abandoned until the mid 00s. There also would have been a lot of off-site training and formal courses. With little more than the formal documentation, training facilities were big business. People were also specialised - so it would be common to fly in a specialist (highly trained) person from IBM to come and do something for you.
PC-based software was great because it was simple. Early 90s were just getting into file-shared networking, so not even client-server. Security wasn't an issue. Applications were simple CRUD apps that made a big difference to the businesses using them, so simple apps had high value. In the 90s, just being able to print something out was a big deal. App environments were simple too. You could have one or two reference books (usually the ones that came with the product) and you'd be fine. You could be a master in one environment/tool with one book and some practice.
Embedded software was a nightmare, and required expensive dev kits and other debugging scopes and hardware. Arduino is only from mid 00s, so the early 90s were highly specialised.
Networking and comms were also in their infancy early 90s, so anyone working in that area had it tough (Ethernet and token ring were still competing). Although the networking people of the day forgot to put in a bunch of security stuff that we are still trying to make up for today.
Not much different to today then. Some boring enterprise stuff, and some exciting new stuff - with all the differences in perks and remuneration.
It was you and the black box. Manuals helped a bit. Books helped a bit. But largely it was you and the stupid box.
Windows 3.11 was all the rage.
Mobile development wasn't a thing.
Networking personal computers was a big deal. There were several architectures up for grabs, and several major players (including Netware) in the mix. It wasn't clear which path developers should follow.
The internet (on personal computers anyway) wasn't really a thing yet.
On Windows, the Component Object Model had just been shipped, and it was all the rage with Windows developers.
Keeping source code in version control was a novel idea. (Mercurial and git were 10+ years away)
Many things haven't changed though. It was important to be current on technology then as it is now.
It continues to be important to be able to communicate well. Software is a domain of the mind because we take ideas in our head and translate them to bits. You need to be able to have a mental model of the problem space and communicate it clearly with peers and clients.
Usenet was 1995's Stack Overflow. You'd get good answers and perhaps a lecture on not using Usenet for commercial purposes or to cheat on a CS homework assignment.
But, the work was very interesting and much more fun compared to today. There were many more different Operating Systems to work with and almost everything was custom code. Now, seems all we do is deal with purchased ERP systems and lots of bureaucracy when issues occur. 30 years ago, we could make changes directly on production systems in emergency situations, now, never.
When I started professionally 15 years later I worked at a desk in an office cubicle. I wore a shirt and tie and kept regular "office hours." Our team worked in C++ on Windows as part of a vanguard effort to adopt object-oriented techniques at the bank. The design processes and project management techniques however were still very waterfall. We had lots of meetings with stakeholders and project managers. We made and revised lots of estimates and the PM used lots of complicated charts and timelines to keep track of it all. The Internet and the web were still a couple of years off and all our work was "client/server" which mostly meant thick clients talking to SQL Server, manipulating the data and then writing it back. Feature requests, code and db schemas tended to balloon and estimates were never met unless they were for very small and discrete changes.
I'm still working in the business today, though now I do cloud infrastructure and systems engineering and use linux. Obviously so much has changed that it's difficult to wrap it up narratively but if I had to pick one dramatic thing it would be access to information. It's hard for me to imagine myself back in those pre-web days even though I lived through them. I am fairly certain I could not do my job today without Google search and other online assets, and yet somehow we made do 30 years ago with actual printed manuals that you had to buy. I paid almost $150 for an IBM manual describing the VGA adapter port/IRQ mapping so I could write some mode X effects. :). I still have it somewhere down in the basement. When Microsoft first launched MSDN and made info available on CDs it was a revolution for me professionally. I suspect engineers who have grown up with all the technical info in the world at their fingertips would definitely feel like 1990 was the dark ages.
After grad school my first job involved building the equivalent of the WOPR (WarGames ref). The AI was written in Lisp, the interface was written in C++ using InterViews/X Windows, all the simulations were written in Fortran, and Ada was used to glue all the pieces together. Except for the simulation code it was all written by a team of 3 of us.
Greenfield projects were truly greenfield. You had to invent almost everything you needed. If you wanted a dictionary data structure you built it yourself. Over the network communication meant having to develop your own TCP/UDP protocols. Imagine doing leetcode problems in C++ without any libraries or templates.
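To make the "build it yourself" point concrete, wanting a dictionary meant hand-rolling something like this little string hash table (shown in plain C; a sketch only -- fixed buckets, no resizing, no deletion):

    #include <stdlib.h>
    #include <string.h>

    #define NBUCKETS 256

    struct entry {
        char *key;
        void *value;
        struct entry *next;         /* chain for collisions */
    };

    static struct entry *buckets[NBUCKETS];

    static unsigned hash(const char *s)
    {
        unsigned h = 0;
        while (*s)
            h = h * 31 + (unsigned char)*s++;
        return h % NBUCKETS;
    }

    void dict_put(const char *key, void *value)
    {
        struct entry *e = malloc(sizeof *e);
        unsigned h = hash(key);
        e->key = strdup(key);       /* strdup: POSIX, not ANSI C */
        e->value = value;
        e->next = buckets[h];       /* push onto the chain */
        buckets[h] = e;
    }

    void *dict_get(const char *key)
    {
        struct entry *e;
        for (e = buckets[hash(key)]; e != NULL; e = e->next)
            if (strcmp(e->key, key) == 0)
                return e->value;
        return NULL;
    }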
Memory bugs were the bane of your existence. There was almost no tooling for debugging other than gdb. Testing was very manual and regressions were common. Source control was originally at the file level with SCCS/RCS, and when CVS came out it was the greatest thing since sliced bread.
Death marches were sometimes a thing. I remember one 6 week period of 90 hours a week. While it wasn't healthy, it was easy because we were exploring new frontiers and every day was a dopamine rush. You had to fight with yourself to maintain a decent WLB.
Like now, there were always battles around what new technology would be the winner. For example, in networking we had Ethernet, token ring, FDDI, AppleTalk, NetWare, and a few others all vying to become the standard.
Working from home meant dialup into a command line environment. So every developer knew how to use vi or emacs or both.
The biggest difference today is you stand on top of all this technology and processes which have matured over the years. This means a developer today is MUCH more powerful. But if they didn't live through it, much of that mature technology and processes are just a given without deep understanding, which most of the time is fine, but once in while can be a problem.
Everything else I'll say was better 30 years ago. Despite the lower salaries, programming was a much more respected profession. Today programmers are only seen as unthinking cogs in an agile sprint, where PMs run the show.
Quality was much higher, mostly as a byproduct of engineering being driven by engineering, not PMs. You got to own all the technical and most product direction decisions. Only occasionally someone from sales would come in and say Big Customer wants such feature, let's do it.
Work-life balance was generally much better since you planned for the long haul, releases maybe yearly instead of permanent sprinting without a long term plan as today with agile.
There were fewer outlets for information but they were of high quality. So, the "signal to noise ratio" was significantly better.
As others have mentioned the systems were simpler and therefore more understandable. This meant that people's creativity really came out. Look up copy protection or how people cranked up the speed of the disk system on the C64 or Atari or Apple.
Tools cost money - free compilers or assemblers were few and far between. The syntax and usage were simpler and compile times were quite low.
There was no memory protection, so the application you were developing could easily take down your development system.
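A concrete illustration of what "no memory protection" meant (on a modern OS this just segfaults the process; under DOS it took the whole machine down, editor and all):

    int main(void)
    {
        char *p = (char *)0;    /* nothing stops you from doing this */
        int i;

        /* Under real-mode DOS this scribbles over the interrupt
           vector table at address 0 -- instant system lockup. */
        for (i = 0; i < 1024; i++)
            p[i] = 0;
        return 0;
    }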
I interviewed candidates for a sw position who fronted with serious Lisp experience in live deployment: traffic-light control systems. (We wanted C. But it stuck in the mind.)
Compile-Edit cycles could leave you time for lunch.
SCCS was still in use, RCS was just better.
You had to understand byte/short/word/longword behaviours in your compiler. Unsigned was tricky sometimes.
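For example, int was 16 bits on most PC compilers of the day and 32 on workstations, and the usual arithmetic conversions made signed/unsigned mixes treacherous:

    #include <stdio.h>

    int main(void)
    {
        unsigned int u = 1;
        int i = -1;
        int v = 200 * 200;      /* 40000: overflows a 16-bit int     */
                                /* to -25536; fine with 32-bit ints  */

        /* i is converted to unsigned and becomes a huge value,
           so this prints "no": -1 is not less than 1u here. */
        printf("%s\n", i < u ? "yes" : "no");
        printf("%d\n", v);
        return 0;
    }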
FP error was common. Not all the bugs were ironed out of libraries (NAG aside. They were really reductionist)
Use of global variables was not yet entirely anathema
The C preprocessor could still run out of #defines.
PDP-11s were getting more uncommon, but not dead. VAXen were common. Suns were mostly 68000s.
There was a gulf between IBM and their seven dwarves and everyone else. UNIX was not quite ubiquitous off campus but becoming so.
What online resources existed for programming at that time were few and far between, and it was much harder to find help. Programming videos were almost non-existent.
Spending time in a bookstore and trying to find a good book to solve your problem was about the best strategy you had in many cases.
Today, technical books still exist in the remaining bookstores of course, but the proportion of genuinely interesting books is much worse. Technical shelves are now filled with garbage mass market books like how to use your iPhone and the like.
Those were my college years. Prior to that, in the mid-80s I developed in Forth / 6502 assembly on my Commodore 128 at home, and QBASIC (sigh) on IBM PCs at work. After college, we were programming in C++ on Gateway PC clones -- had one at work, and a slightly less powerful one at home. A helpful co-worker introduced me to Perl, which quickly became an essential tool for me. Around this time we finally had version control software, but it was terrible.
When I went professional in 1998, the .com boom was underway. It was a wild time full of challenging work and exponential rewards. The hours were long, the pay was crap, and the value of programming wasn't fully acknowledged.
Compare that to today. I have banker hours. The pay is great. My job appreciates me. However, my coworkers are all 20 years younger than I am and make similar or more pay with half my experience. But experience beyond 5 years in this industry doesn't count; it's treated as irrelevant. Only the mention of huge companies' names in my resume is worth something beyond 5 years.
Architecture and design is a little bit more defined now, and there's a long history of what works and what doesn't, if you have enough experience to look for it. The actual SDLC and workflows are a joke today, everyone just pays them lip service and most don't even understand why they do them at all. We struggle more today with local dev environments because they've become over complicated. Adjusted for inflation we are making $50K-$100K less today than we did in 1998. We still have devs who don't want to know about how their app runs in production, leading to the same bugs and inability to troubleshoot. Apps are getting larger and larger with more teams that don't understand how it all works.
There's a lot more software out there today, and a lot more information (much of it not good quality) so there is more that can be done easily, but it needs to be managed properly and people struggle with that. Security is, amazingly, probably as bad as it used to be, just the attacks have changed.
There doesn't seem to be real training for the neophytes and they're getting worse and worse at understanding the basics. You can be called a software developer today by only knowing JavaScript, which would have been crazy back then. But now that I say that, perhaps it's comparable to PHP back then.
I was to port some C code implementing TIFF into their graphics library. This was before the WWW took off (or was known to me). They had Internet (with a host naming scheme incorporating building and floor, so byzantine that we found it easier to remember the dotted-quad IP address), but I don't recall for what or to what extent it was used. So a good part of the job was to read printed documentation.
vi was used in that group as the text editor and I was expected to do the same. I hadn't seen it before and, when given the two-page cheat-sheet, I thought it was worse than WordStar on (already long obsolete) CP/M. I learned the basics quickly enough though, and even grew to appreciate it when using a terminal connected via 9600-baud RS232 ...
We enjoyed flexible working hours, with a core time from 11am to 2pm where people were expected to be in the office (in order to ease scheduling of meetings, of which I recall none; we met at lunch in the cafeteria though). I had to leave before 8pm though, as then the porter would lock up the building.
Also: way fewer libraries. You might write ALL of an application. You might call the OS to just to read/write files. Today is much more gluing together libraries, which is nowhere near as much fun.
Nonexistent, nonexistent, nonexistent.
Hope this helps :P
To detail: thirty years ago it was more or less clear that waterfall was a dead end, but it was not yet clear what to do instead. UML kind of redeemed waterfall, but it didn't exist yet.
As for work-life balance, time crunches at some software companies became legendary when shipping time meant shipping time, so people slept in the office and crazy stuff like that. At this time there were still one-man projects, but that era was coming to an end.
There was no internet and we had no books but somehow I figured out the 'man' and 'apropos' commands. From there I read all the section 1 man pages, experimented with the things I found and basically figured out how unix works. Within a couple of months I had a suite of shell scripts doing most of the regular maintenance work for me.
A few months later on a Friday afternoon one of our legacy systems, the one for processing cheques, died (literal smoke coming out of the back of it). My colleague and I had been learning C in our spare time so we volunteered to rewrite it over the weekend. We had no sleep that weekend, but we delivered a working replacement (using ncurses) by Monday lunchtime.
It was a simpler time and a more charming time. The internet has been a game changer, both good and bad. It's easier to learn new things now but there are a lot more new things to learn.
Work-life balance is a hard one for me to answer because my situation has changed. Back then I enjoyed being at work more than I enjoyed being at home so I worked super long hours, I even slept in the office fairly regularly (mostly because of the pub across the street). So there was no work-life balance, but I liked it that way.
There were crazy entrants like Omnis 3/Quartz which had you select statements instead of typing them! So you would pick out "if/then" and then fill out the expression.
Anything you did provided incredible value, so that people were really happy. I was getting into VB right about that time (roughly) and you could build apps so quickly -- and not to replace a solution, but to create the first solution.
And to reiterate what someone else said, I had my own large office in a skyscraper with an incredible view and was treated (not just paid) like a top professional.
DOS was still popular because it was so much cheaper than a workstation. Single-tasking means exiting your editor to do code check-ins and having to reload your context a lot. Networking was definitely an after-thought that many didn't have access to.
The bigger issue was the device you were programming. Small memory. Pointer sizes that might not be what you expect. Integers that overflow if you fail to think about their sizes. Pointers that cannot reliably be compared without normalizing them first. No threads. An operating system that does nothing other than provide access to files.
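The pointer-comparison oddity deserves a note: in real mode a far pointer was a segment:offset pair, and many different pairs named the same byte (0x1234:0x0005 and 0x1230:0x0045 are both linear address 0x12345), so you normalized before comparing. A sketch, assuming 16-bit unsigned ints:

    /* Canonicalize a segment:offset pair by pushing everything
       but the last nibble into the segment. */
    unsigned long to_linear(unsigned seg, unsigned off)
    {
        return ((unsigned long)seg << 4) + off;
    }

    void normalize(unsigned *seg, unsigned *off)
    {
        unsigned long lin = to_linear(*seg, *off);
        *seg = (unsigned)(lin >> 4);    /* top 16 of 20 address bits */
        *off = (unsigned)(lin & 0xF);   /* offset now in 0..15 */
    }

After normalization, two pointers to the same byte compare equal field-by-field.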
I've used C++ for my entire career. CFront 1.2 was a very different language than modern C++. Sometimes people wonder why I use stdio.h instead of iostreams: it's because stdio.h is basically unchanged since cfront and that code still works, while iostreams has changed significantly multiple times, requiring re-writes.
One thing I miss is some of the color schemes in the Turbo Pascal/C IDEs. Yellow on Magenta and Yellow on Cyan were fantastic. I'm not sure why those color schemes don't work in GUIs -- I suspect it's because the geometry of fonts is so much different than text-mode graphics. Text mode had super thicc and clear fonts made out of so very few pixels.
Software architecture was a thing - you were given the full responsibility for your component. Something to take seriously. And something you could take pride in, when your process was the first one to run 24/7 without crashing or memory leaks...
Work-life balance was worse for me because I spent lots of "free" time learning stuff to apply at work. Now I'm trying to do things I enjoy. Coding isn't among those things any more - Scrum and its ilk took all the fun out of that.
No concept of unit testing, integration testing or CI - customer support gave things a quick look over and the program got sent out so we always scheduled a 2 week Bug Blitz to deal with all the issues that the customers found.
Small company, salary was good, regular hours and a challenging environment with all the rapid tech changes
My environment was almost identical to what I have now. A *nix box with a big monitor (then: a DEC Alpha 600 running OSF/1 with a CRT, using ctwm; now: an AMD Threadripper running FreeBSD and an LCD, using lxde). All my dev work then and now was done in emacs, driven by Makefiles and built/executed/debugged from a terminal window and managed by a revision control system (SCCS/CVS then, git now). Honestly, most of my dot files haven't changed that much in 30 years.
My compensation is far better now, mostly because I'm senior and work for a FAANG rather than a University.
It was a wild year or two. If you could spell C++ or Java, you were employable at salaries well above average. Peak dot-com - the web was going to solve everything and make us all rich at the same time.
Then March 2000 hit and it all fell apart. And quickly.
Anyways, salaries were high. Not current SV/Seattle high, but high enough. IIRC, $50-$60k was a common range for new graduates. Equity was there, but again, not to the same levels as today's unicorns or FAANGs (AOL and a few others being the exceptions).
Work-life balance was similar. Lots of start-ups with none at all, but lots of beer and ping-pong. Mature companies (IBM and government contractors) were a bit better. Microsoft somewhere in the middle.
Waterfall was very much a thing at the Lockheeds of the world. Smaller companies were less rigid, but "agile" wasn't yet an industry buzzword (Agile Manifesto was 2001).
IDEs and tooling were nowhere near as efficient and helpful. Lots more plain old editing in vi or emacs or whatever. Compiling code was a lot more manual - makefiles and such at the command prompt. Version control was 100% manual via CVS or similar.
Better? If you got into AOL early, yeah, because you were rich. For the rest of us, it wasn't better or worse, but it was good then and it's good now.
Basically, desktops were where it was at. If you were a programmer, you were almost certainly creating software for desktop computers or mainframes. But, 30 years ago was even before Windows 95. It would have been around the era of Windows 3.1.
I worked on AS/400 accounting software. (I think my boss stole a copy from somewhere else that ran on System/36.) We supplied this to a few dozen mid-sized, blue collar companies in town. (One I remember sold gas station supplies.)
Programming would be done on site. Customers would want a new report, or a new field added to their system. I would come on site and code it for them. I guess in a way it was very agile. But I had no team, just me. And once in a while my boss would stop in to see how it was going, and help when I got stuck.
Lots of good Unix work came from that time which fell into place when Linux appeared. Things ported from there to Linux pretty painlessly which greatly reduced the cost of hardware. And that World Wide Web thingy that started to appear was pretty neat even at 19200.
- I worked on C code, it was nicely logically divided into libraries and folders and you could build one folder at a time to save time.
- I was still young and not exposed to processes but there were signs (paper signs) in the corridors about RAD (Rapid application development) and QA was a separate department, only my manager talked to them.
- Compensation was rather good, and a few years afterwards it became even better
- WLB was non-existent, but again, I was young and didn't care
Things were simpler, I knew the code down to which bits the CPU flipped, debuggers used primitive DOS GUIs and source control was something we considered starting using.
What's interesting is that there were still fads of the week. The main product I worked on (a cash register system) had originally been generated by a code generation tool that someone had sold the company as a way to reduce their programming costs. Its main event loop was something like 50 screens long (though in their defense, screens were only about 800x600 pixels then).
We had internet, but it was different than today. Instead of HackerNews, we had Usenet News. It was a pretty good system for what it was. I had a specialized reader just for Usenet. There was IRC and AOL for chat. There was no music or video, though. Images just barely worked.
Because I worked for a large company, we had source control, but it was something written in-house and was pretty bad. (But better than nothing.) We wrote in Visual C++, which was not visual at all in the way that Visual Basic was.
In terms of simpler systems, one thing I just remembered was that because I worked for a large corporation, we had a hardware analyzer that we could use to see the exact instructions the CPU was running in order to debug really hard problems. I only used it once or twice, but it worked really well. (I mean the UI was terrible, but you could get some great info out of it.) I think someone said the device cost like $50,000.
No emails, no smartphones. Support people had pagers. But you weren't expected to fix world-serving infrastructure remotely. If it failed during the weekend, you'd most likely fix it on Monday. Unless something bad* happened.
There was no Internet, unless you were in academia. Knowledge came from books, magazines and from what you remembered if you didn't have these with you. Having a good network (the human kind) was essential. Although some hermit types made it very far alone... some descended into madness.
There was no Linux. A lot of software was expensive. Most software you could pirate if you knew the right people or had an account on the right BBS. "Shareware" was where the value was. You knew nobody would come after you if you never posted a paper check to the guy who made it to get a license.
Most viruses were not subtle. They would piss you off and trash your stuff. You had to reformat. You learned quickly to keep copies.
On the "sneakernet", floppies were king. Backpacks full of plastic. Systematic copy sessions between friends. Swapping sequenced media had a rhythm of it's own. A good excuse for drinking.
Hardware was expensive and evolved fast. That new 486 was _impressive_. People gathered around it to see it compile and do "3D" TWICE as fast as the fastest 386 before it. Moore's law in full effect. If you had the latest shit, you could boast about it with _numbers_. Not just some new color.
I knew people who printed pieces of code to study on the bus. Printers were way more important than they are now. They still haven't died and that's too bad. I hate printers.
*I was the guy with the pager that had to leave the bar at 2AM to feed printers. Fuck printers.
The main thing was, for me, that very, very few of us actually had any CS/SWE training. Almost everyone I worked with had schooling/training in other disciplines, and migrated over to SE (I was an Electronic Technician, then an EE, and then an SWE).
It was pretty cool. We didn't have the extreme conformity that I see so much, these days. Lunchtime conversations were always interesting. We also approached problem-solving and architecture from various angles.
We knew what we knew by reading programming magazines, books, or joining some local club.
The luckier ones could eventually connect to some BBS, assuming they could also cover the cost of what were usually long-distance calls.
You really had to account for how many bytes and cycles the program was taking if you were doing any performance-relevant application.
If that wasn't relevant, though, it was still possible to deliver applications in higher-level languages.
1. Processes: in smaller companies there was very little paperwork/red tape. You got the requirements, did the design, had a review, started coding, tested, deployed. In most cases I've seen, there were just 2 environments, development and production.
2. Design was very, very simple: not many choices and coding was straightforward, no OOP, a Hello World program had a single line of code. Almost no libraries, frameworks, dependencies. (maybe an oversimplification, but you get the point)
3. Work life balance. We had Duke Nukem 3D parties in the office in the evening; those were the only cases where people did not leave at 5PM. There was no rush, overtime, or emergencies - except my team that was doing support 24x7, but even that was still fine.
4. Compensation. It really depends on the country, but at that time I had the best pay of my life as a ratio between my salary and the country average. It only declined over time, even if in USD it is a bigger number today. Taxes also rose a lot.
5. Productivity and performance of the code are a lot better, but the life of the developers is a lot harder; the area is just too complex to stay really good at over time, the number of changes for the sake of change is enormous, and the fragmentation of languages, libraries and frameworks is insane. There is no single good way to do things right; there are 1,000,000 ways to do it and nobody can compare them all.
6. Not asked, but ...: people were a lot more competent on average. What I see in the market today is developers by the kilogram, with very good developers lost in a sea of sub-mediocrity. Also, the pace of change is so fast that most people never get to become experts in something before it changes. It is like working while running.
I have not been a real developer for over a decade, as I do architecture and management, but I am the most technical in my area of over 1000 IT people; even if I don't write code full time, I am still very close to it.
Did some work on various larger systems, IBM (VM/CMS), DEC (Unix).
Shelves with reference books. Lots of low level code (graphics directly on EGA/VGA etc.). Late nights on BITNET or the Internet. USENET/newsgroups.
Remote access with slow modems.
Borland/Microsoft tooling + tooling from embedded vendors (assemblers etc.).
Compensation: low. Work-life balance: None (I didn't have a life back then ;) ). Design principles: Some OO ideas, break things down to reasonable pieces, nothing too structured. Processes: None.
Are things better? I had more fun back then, that's for sure. For some definition of fun anyways ;). At work anyways.
Processes were easier, but I tried to stop people from editing as root on live machines, to have a test setup, to start using CVS (a big step forward from RCS), to start using i18n (multiple language support), and to start engaging in online communities and conferences. There was not much open source beyond the few well-known BBSes and source archives, GNU, freeware, and shareware.
Design was waterfall, work-life balance was decent much as it is now, compensation was worse, but I worked mostly as an engineer and did SW programming only for the job.
Better? Better tools for sure. Tools and compilers are now free, and you have a huge free support system now.
* Reference manuals shipped with compiler and good reference books were gold
* You could learn the entire API/SDK/Framework and keep it in your head
* Blocks of assembly were a legit way to improve your code's performance
* OOP was hitting mainstream along with all the new ideas and approaches (and pain) it entailed
* Turbo Pascal rocked for DOS development
* Source control was iffy/non-existent
* I loved it
Pay was similar to today if you normalize as a function of rent and gas.
I don't recall seeing a test in the wild until I picked up Ruby on Rails several years later, but there was a lot of manual effort going into QA.
I remember there were a lot more prima donna developers out there who tightly controlled swaths of code, systems, etc. and who companies were afraid to fire.
From my perspective, the software development pipeline is much improved thanks to tests and devops and the move to cloud infrastructure has added a ton of stability that was largely lacking back then.
I was just a kid 30 years ago learning to program, by curiosity mostly, on an Amiga 500 using AmigaBASIC and some assembly.
It was neat. The manuals were helpful and nearly complete. You just can't get that on modern computers. The Intel manual is a monstrosity.
Sure, if you made a mistake in your program you generally crashed the entire computer. But that didn't really matter much back then on those machines. You just turned it off and back on and tried again.
It will always feel novel to me though because I had plenty of time to grow up in a world without having a computer before they came into my life. When they did it felt like I was a part of a secret world.
Less so these days of course.
No version control meant merges were ... problematic at best, so you developed software without merges, branches, or pull requests, and surprisingly you can run a pre-2K economy pretty well off software written that way ... People tended to own a file at any given time, and that meant we organized files along project tasks. We didn't have "no version control"; we just didn't use version control tools, because none had been invented - well, maybe RCS or CVS, but nobody used those outside academia. We could still use backups and restores if we needed an old version. Also, all filesystems had something.c, something.c.December.12, something.c.old, something.c.old.old.old, and something.c.bobs.copy. Often source code would entirely fit on one floppy disk, so you'd just have a disk with a name and version written on it, and that was your "version control" if you had to review version 1.3 from the past. There were also network directories with archive files named projectname-1995.03.12.tar.gz
One-step builds were theoretically possible - "make install" was old stuff 30 years ago - but in practice, unit testing, distribution, and approval processes were all simpler than they are now. Not every platform, language, and technology used makefiles, of course.
Everyone had quiet working conditions 30 years ago compared to "open office" so productivity was higher. Things are better now with full time remote.
Some of the old questions are kind of invalid, instead of the best IDE/compilers money can buy, all that stuff is generally free now.
Another eliminated question: As an industry "hallway usability testing" has been eliminated from the profession. There are too many middlemen "project manager" types who get a salary to keep programmers away from users. When someone's salary depends on preventing enduser input, you're not getting enduser input as a programmer. Maximizing enduser happiness is no longer the most profitable way to sell software, so goals are a lot different now.
Later, NEC Multi-sync monitors. And Sony Trinitron vertically-flat displays.
RSI became a thing, along with ergonomic Kinesis keyboards. And upscale companies got Herman Miller chairs and padded cubicle walls.
> in terms of processes,
process is a human problem not a tech problem. Process ranges from good to crap depending on who you work with
> design principles,
a lot of the questions around design principles have been offloaded to the framework you choose.
> work-life balance
again, human problem. There are crappy bosses and companies and good ones. Hasn't changed.
> compensation.
pretty sure the average dev salary is even stupider now than it was then. Like new grads getting $100k+ at some companies is ... just WTF as far as i'm concerned.
> Are things better now than they were back then?
it was pretty awesome. You could actually be the one person who managed everything on the server.
everything was radically simpler. We used CGI to solve simple problems instead of massive frameworks. JavaScript in the browser was a collection of targeted functions to solve your specific interactive needs instead of a 2nd massive framework.
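To make the contrast concrete, here's a minimal sketch of the sort of thing a whole "web app" could be back then - a single C file dropped into cgi-bin (the file and its output are hypothetical; this is just the shape of it):

    /* Minimal CGI "app" of the era (hypothetical): compile it, drop the
       binary into cgi-bin/, and the web server runs it per request. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const char *query = getenv("QUERY_STRING");  /* set by the server */

        /* CGI contract: headers, a blank line, then the body */
        printf("Content-Type: text/html\r\n\r\n");
        printf("<html><body>\n");
        printf("<p>Hello from one small C file.</p>\n");
        printf("<p>Query string: %s</p>\n", query ? query : "(none)");
        printf("</body></html>\n");
        return 0;
    }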
Even ~17 years ago when Rails was released (and then its clones), it was still just 1 framework that you could wrap your head around that did _all_ the things from the DB to the HTML to the JS.
If I had to summarize I'd say, we've forgotten how to do simple things with simple tools. We've decided that how much RAM you use doesn't matter, and we very rarely care about CPU. We just add more servers.
DevOps is now its own thing because... there's just sooo much more complexity around hosting that it requires enough knowledge to have become its own specialization. That being said, I think we've just started assuming that it needs to be that way. That we _need_ to have all these complicated cloud things to serve sites that rarely need it.
Also email's gone to crap. You used to be able to send email from any server. Now you have to use email sending services or you're very likely to have your email eaten by spam filters.
The human social bits haven't changed.
I had an office at my next job, too. Needless to say that's all gone now, except ironically the work-from-home has once again given me a mostly quiet workspace.
[1] https://www.amazon.co.uk/Managing-Gigabytes-Compressing-Inde... (1999)
EDIT: The values are in INR.
Ex.: People were writing about agile, no-code, and the challenges of reducing cost and complexity before we had the terminology for it and before a whole industry of consultants existed to explain it.
Application Development Without Programmers https://a.co/d/2kKeOTx
Now: mainstream, boring, pretentious, infantile.
You earned your pay at night, when a program blew up. The mainframe would provide a core dump, which was printed out on hundreds of pages of 'green bar' paper by the operator. You would go to the source library for an equally gigantic (like 900 pages) program listing to find the source of the problem. This was accomplished with some hex reading and an IBM reference card that explained computer instructions and arguments.
'Networking' was always SNA and always secure. TCP was still a few years away.
C was available on the mainframe a little later. Linux on the mainframe happened soon after that, too.
You mastered your application domain by dragging paper listings home to read frequently.
Hot shot programmers might write 'online' programs (CICS), Assembly-language modules (for speed), or maybe do 'systems programming' (mainframe system admin).
It all seems pretty ok now. Probably it wouldn't be fun to go back to it, though.
I was writing assembly code for a fire alarm interface system. Code was assembled using a DOS based system, flashed onto an EPROM for testing on the boards. Debugging consisted of tracing signals through the system, or if lucky, the boards had a spare serial port we could use to print messages.
I started in 1995 building an in-house Windows app for support staff at Iomega. I was a support person, and not a professional programmer, though I had been writing code for 10 years. The project was part of my launch into professional development.
It was a simple program not unlike the type I create today. It did one thing and it did it well. Support staff used it to log the root cause of incoming phone calls. It was used by about 200 employees and then we used the data to try to solve problems that were our top call generators.
Build systems for some languages are much more complex now and the Internet was just getting revved up back then. The best systems to work on seem to be the small simple ones, for me.
Edit: Learning from books instead of the Internet was a major difference. I had some wonderful coding books. A giant book store opened in the mall where I worked (just prior to Iomega) selling discount overstock books. I acquired several dozen computer books and I still have many of them.
Although that was a job while I was still a university student (creating the very first web pages for the university), so I was at the very beginning of my career at that time... and to this day my career has mostly been in academic and non-profit settings, which makes it an atypical comparison, so my memory looking back may be colored by that.
But I'd say it was... "smaller". By and large salaries were smaller, not the crazy salaries we hear about on HN (which I don't really receive, to be honest). Work-life balance was a lot more likely to be there by default, things were just... smaller.
There were fewer jobs, fewer programmers, a higher percentage of them in academic settings. Other programmers were mostly geeks, in it for a fascination with programming, not dreams of being a billionaire. (Also mostly all white men, which I don't think is great).
Even 30 years ago there was (the beginnings of) an internet for learning how to do things from your peers -- largely via mailing lists or usenet groups though. There was a large sense of camaraderie available from such text-based online discussions; you'd see the same people and get to know them and exchange ideas with them.
And sometimes exchange code. I think in some ways 30 years ago may have been close to the height of open source culture. (Raymond wrote the Cathedral and the Bazaar in 1999). As a thing aware of itself, and a culture based around taking pride in collaborating and sharing non-commercially, rather than figuring out how to get rich off open source or "open source". Open source culture was essentially anti-commercial mutual aid.
Also, you still often learned from books. There was always A book about any given topic that had gotten community acclaim as being great, and you'd get THAT book (say the Perl Camel book), and read it cover to cover, and feel like you were an expert in the thing. Which meant other people in the community had probably read the same book(s) and had the same basic knowledge base in common. There weren't as many choices.
I would say things were just... slower, smaller, chiller, more cooperative.
But is this my unique experience or colored by nostalgia?
We did it on 386 or 486 machines in DOS with no internet, though. So that was different.
Since a clean compile of the framework took like 8 hours, we would often kick off a clean build from our Macs before heading out in the evening.
But it was also fun and worth doing.
Personally I wasn't working yet, 30 years ago, so that was just my own side-projects when I had a chance.
https://anarchivism.org/w/Byte_(Magazine)
Many of these tomes are huge (300 megabyte) PDF files.
I’ve been re-reading these via another retro computing site, lower resolution scans, but a complete set and optimal for downloading and offline reading.
And floppy disks? Sucked. Sucked big time. Get too near a magnet and your floppy is toast. Get a box of new blank HD floppies and one or two in the box might be bad. Slow as fuck, too. Everybody should thank Steve Jobs every day for killing floppies.
Total compensation is better now, as long as you get RSUs. But you used to be able to rent a solo apartment near work without roommates, and good luck to junior engineers trying to do that now.
https://fabiensanglard.net/gebbwolf3d.pdf
I have been a dev in places where they didn't know what a story or Jira was...heaven. That having been my experience, I made the mistake of taking a standard corp dev job...fml. Career death.
The pay is good but my soul is rotting.
If you encountered a bug, and it wasn't in the manual, well, then you're stuck until you think of a workaround.
We had an IPX Novell network at school. We developed a chat application called "The Mad Chatter". And it was so much fun being in a class where a teacher trusted us to work on our pet project.
Ridiculously goofy fun.
I got a demo of it running a few months back with DOSBox. https://www.youtube.com/watch?v=fxlie0f7pkE
Back then, I got to work on one thing, because one thing at a time was all that they expected of me. Now I get pulled in many directions at once, and it's really hard to focus on one thing. But that may not be so much because the processes changed, but because my role did.
Thirty years ago, the problems and projects were simpler, but the tools were worse. It kind of evened out.
I think the processes have gotten more complicated because the projects are more complicated. For the same complexity of project, I think the processes have often gotten simpler, because the stakes are smaller. You're building a web page front end for your ecommerce site? Your process is a lot lighter than when people were building NORAD air-defense systems, even if the complexity of both is about the same.
I still have the same limits I did. 100,000 lines of code is about all I can comfortably deal with; after that I start getting lost.
As I have gotten older (and I worked with older co-workers), the level of interpersonal maturity has gone up. That's been a good thing; both I and some others used to be jerks, and now we're not (or at least not as much). Again, though, that's just tracking with my age and seniority, not the industry as a whole.
I worked primarily on an operating system which supported key-value stores as native file objects (DEC VMS/RMS).
I worked on business apps in VAX BASIC. Actually, I spent a fair amount of time abusing compiler pragmas so that I could e.g. do memory management in VAX BASIC. One of my long-term clients was the guy who invented the report wizard. Literally. The prototypical cockroach company: at one point he travelled around the country in his van helping people install his software so that they'd try it out. There wasn't much in the way of advertising for software; recommendations were largely word of mouth.
I helped write SCADA systems in VAX Pascal. I wrote a DECNet worm; no it didn't escape. I'm probably (at least one of) the reason(s) that DEC implemented high water marking in RMS.
I did LIMS in HyperCard. Very primitive by today's standards. Things like ELISA notebooks. Makes me wonder why with the CI/CD fetish these days there is no GUI for building web GUIs: why are people writing code the old way? (I have more than opinions about this, and I know of one high quality GUI for writing web GUIs.)
There wasn't "open source" as we know it. As a contractor I developed a doctrine of "tools of the trade" because there aren't many good ways to write e.g. string compare.
My first ever CGI was written in DCL, the VAX VMS command line language.
About a third of my work was firm bid. Contracts were generally reasonable, clear, and written to keep the parties out of court.
People scoff these days about Stack Overflow, but we were more reliant on examples in print that sometimes wouldn't work. Stack Overflow is just a resource only slightly less trustworthy than a first edition manual. On the flip side, most coding was quite simple, but for internet work you needed to know Perl or C++ as well.
References were books. "UNIX In A Nutshell" and "PERL In A Nutshell" were pretty much on everyone's desks, in addition to the K&R C book and various C++ books.
Systems were simpler in that you didn't have multiple frameworks to deal with.
My teams built supporting tools for a system on OpenStep, so we coded some in Objective-C and PERL to support file systems operations. Then our lead found Python and migrated our PERL stack to Python, which was great, but it was Python 0.5 so there was no published documentation at that time. I wish I had kept a copy of my emails and responses with Guido van Rossum because he was the only support available!
Very fond memories ...
- There were no containers/Docker yet, so everything had to be installed on each server manually every time, or by a hand-hacked script that was not always easy to repeat on the next server. This produced mystical bugs. Glad we have Docker today.
- Source control systems of the era sucked compared to Git. Back in the day it was CVS and VSS, then everyone got hooked on SVN about 20 years ago, then there was a duopoly of Mercurial and Git, and finally Git won about 10 years ago. Life got a lot better once we got Git.
- There was no virtualisation yet. VMware had just appeared around that time, being a very slow software emulator, barely usable. So you needed several physical computers to test things on, necessitating an office at least for management and testers; the others could work remotely.
- The Internet sucked; it was good only for sending mail and commits, and some task tracking systems. Voice communication relied on phone calls before Skype, and video communication was impossible, so physical meetings were necessary for remote and partially remote teams.
- Worst of all, there were no package managers yet, so building something from source required an awful lot of work. This is why Perl was so popular in the day even while being so terribly cryptic - it had a package manager and a big library of packages, the CPAN. It was almost as good as npm today and used the same way; now this is expected from any language or platform, but back in the day it was unique.
(and yes, at least Linux was already a thing. I can only imagine how bad it was for those who worked just 5 years before, when they had to rely on commercial server OSes)
I probably would not enter the field today.
1) Square feet of desk space
2) Ability to divert incoming phone calls to voice mail
Square feet of desk space might sound absurd and pompous, but it was a real advantage. Large monitors and powerful IDEs and code search tools were still decades in the future. As a programmer you used a printer and you kept stacks and stacks of printed code on your desk. The more square feet of desk you had, the more efficiently you could lay out and sort through the printed code you needed to look at (this is why Elon told Twitter engineers to print their code - he's a boomer programmer who grew up in this era when that really was how you read and reviewed code).
Phone calls to voice mail is more obvious today, but that study was I think the first to point out that programming was a flow state activity where it typically took about 10 minutes to get your mind loaded with all the info and context required for peak programming productivity. They observed that one two-minute phone call was enough to wipe all the context out of your mental stack and drop you from peak productivity back to minimal productivity. If you had one small interruption every ten minutes you never hit your peak flow state as a developer. (With the massive improvement in IDEs etc., I'm curious whether that preload timescale remains consistent or has dropped, but I'm not aware of anyone doing more recent work on that question.)
The machine was basically a home computer running a REPL with BASIC, beefed up with a handful of input/output ports to get data from sensors and operate pneumatic actuators on the rig.
No graphics beyond what you could do with ASCII on a monochrome monitor.
In the end it came to ~2800 lines of code including comments.
Design principles: a) make it work and b) make the code as readable as possible.
Work-life-balance: it was a normal engineering job, so basically a 9-to-5 thing. Although I did put in a Saturday appearance at the end to get everything done (including manuals) before my stint ended.
All-in-all it was as much fun as one could have in test automation at the time; writing the program from scratch and adapting the rig where necessary.
No internet or even cell phones, I didn't have a computer at home - if something broke you had to wait until the next day when people were back in the office.
These days, if you strike a problem, you can get feedback from thousands of guys on the Web who have been there before you. Back then, you didn't have those guys and you had to gradually work out your own solution to the problem. Sometimes you didn't. And had to abort that project.
Before coming into contact with CVS or Git, the method I used when I was unsure a change would be productive was to simply copy the block, comment out the copy, then make modifications, along with notes about what was different, and when.
There were lots of lines like
; moved xxx 92-02-01 MAW
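As a sketch of the practice (everything below is a made-up illustration, in C rather than the assembly above): the superseded block stays in the file, commented out, with the dated initials right next to it:

    /* All names hypothetical. The old loop is kept, commented out,
       with a dated note in the same spirit as the line above. */
    #include <stddef.h>

    struct entry { int key; int value; };

    int find_key(const struct entry *tbl, size_t n, int key)
    {
        size_t i;

        /* 92-02-01 MAW: stop at first match; old loop kept just in case
        for (i = 0; i < n; i++)
            if (tbl[i].key == key)
                found = i;
        */
        for (i = 0; i < n; i++)
            if (tbl[i].key == key)
                return (int)i;   /* the change: early exit on first hit */
        return -1;
    }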
After going around and replying "no" to a half dozen or so different individuals with increasingly philosophical responses he broke it down for us with, "you're all here because 60k! You're going to graduate with a computer science degree and make 60k a year!"
To even imagine that was supposed to be an inspiring number back then is pretty laughable. I was already making as much doing chintzy web dev stuff.
It was a different time for sure. There were no (or very few) laptops, you took notes on paper, you went to the computer labs to do work (with old sparcstations and such) or you remoted in to your shell and used emacs and gdb. Pretty simple times.
Therefore, I can't say much about the work part. It may be a result of having grown up, but the most significant change I feel is perspective. Back then there was no tomorrow. No worries (or hopes, or plans) that anything created would need to be maintained, or obsoleted/replaced by anything in the future. Everything was right here, right now. Today, anything comes with an expected life-cycle, at least implicitly. Constant worries that the next minor OS, browser, language, ... update is going to break something. Back then, if it ran on MS-DOS 3, it would run on 6. And most likely in a command window of Windows, and OS/2, too.
Work wise it is still the same deal.
UI toolkits cost money.
So did compilers.
However, the issues then are still the issues of today:
* software maintenance is not exciting but is critical
* designing software & systems still requires critical thinking and writing skills
* communication is a skill most people still need to learn and practice
* yes there are better technologies and artificial intelligence will save the day, but people are at the heart of all of the needs software is trying to satisfy, so understanding human behavior is very important
There's this: back then I made good hay from my excellent memory for APIs and minutiae of system behavior. That's a completely unimportant skill now. Now you can look any of that up in a few keystrokes (at worst -- probably your text editor just pops up the relevant info as you type).
But my main skill is as useful now as it was then: I find solutions to problems. And I don't fall in love with a solution before I fully understand the problem (which is really never).
People nowadays complain about algos and red-black trees but honestly, that was the easy bit. There wasn't much open source, so you had to build pretty much everything from scratch. The internet was young and empty, so big fat books were how you learned. No condensed version or easy troubleshooting. C and C++ were as dominant as Python is today, but nowhere near as fun. (https://xkcd.com/353/)
In short, the deal was to be a cog. If you were good as a small cog, you'd move up to be a bigger cog. Then you could manage a few cogs doing a tiny bit of a huge machine. The scrappiness of just throwing things together and getting something meaningful quick simply wasn't there.
I left, spent 15 years of my career doing decidedly different things, and when I came back I was overjoyed with how little code you actually needed now to get stuff done.
The source code is here, by the way: https://github.com/mvindahl/interword-c64
Still, a few general observations about that particular corner and that particular time of software development:
- There were multiple successful 8-bit platforms, all of which were very different from each other. Different makes of CPU, different custom chips, different memory layout. You could be an expert in one and an absolute novice in others.
- The platforms were more constrained, by magnitudes. A very limited color palette, far fewer pixels, far less RAM, and far slower CPUs. For a semi-large project, it could even become a challenge to keep the source code in memory and still have room for the compiled machine code.
- On the upside, the platforms were also far more stable and predictable. A Commodore 64 that rolled out from the factory in 1982 would behave identically to one built five years later. Every C64 (at least on the same continent) would run code in exactly the same way.
One thing that followed from the scarcity and from the stability is that there was an incentive to really get close to the metal, program in assembly language, and get to know the quirks and tricks of the hardware. Fine-tuning a tight loop of assembly code was a pleasure, and one could not simply fall back on Moore's law.
It was a simpler world in the sense that you didn't have to check your code on a number of machines or your UI on a number of window sizes. If it worked on your machine, it could be assumed to work everywhere else.
Another thing that I remember is that there was more friction to obtaining information. The internet wasn't a thing yet, but there were text files flowing around, copied from floppy to floppy, and you could order physical books from the library. But a lot of learning was just opening up other people's code in a machine code monitor and trying to understand it.
Some of these things started to change with the Amiga platform, and once PCs took over it was another world, with a plethora of sound cards and graphics cards and different CPU speeds that people had to deal with.
The big thing is, the internet was new, nerdy, most people didn't even have it, and I ended up carrying around the big thick manual my calculator came with because there was nothing else. Google didn't exist. And in fact I'm not sure what if any search engines there were at the time.
Developing agile, as it is done today, is better though.
Oh and before PHP came about it was mostly CGI-Bin
You’d often get, “can you start on Monday?” during interviews, 100x better.
Today we have better tools, folks are more serious about reducing bugs, projects are better about avoiding well-known hurdles, all good.
Compensation is theoretically better but due to rises in housing, education, medical costs I’d say it’s a bust.
I do remember liking Brief a lot though
Didn't exactly position me well for the future, haha
- No SaaS.
- No “information super highway”.
- Bigger focus on D2C marketing due to information asymmetry
The best part was a feeling of hope. The industry was full of "evil" companies like Microsoft and IBM but there was a pervading sense that everything was headed in a fun and open direction which would surely lead to a better tomorrow. Nobody loved Microsoft per se but they were absolutely more open and fun than IBM, and Linux and the interwebs were on the horizon and surely we'd have a decentralized digital utopia in another decade or two.
You felt like you were on the cutting edge of that, even if you weren't working on something particularly cool. Kind of like Neo's boring day job vs. his night life.
processes
Source control and automated tests were rare/nonexistent at many companies. They were surely standard at larger software companies but not in smaller shops.
design principles
It was an inversion of today, in a way. In the 1990s developers had a more solid grasp of computer science-y fundamentals like algorithms and data structures. But we were far less likely to know anything about best-practice design patterns, architecture, and that sort of thing.
People (including me) complain about how modern software engineering is less programming and more a matter of choosing a framework and fitting existing bits together like lego bricks without really knowing how anything works.
What gets talked about less is how frameworks like Rails, React, etc. are generally (generally!) built around tried and true paradigms like MVC, and how this is something that was much more likely to be lacking in software development projects 30 years ago when everybody just sort of "rolled their own" version of what architecture ought to look like.
You had genuinely smart people trying to do things like separate app logic from presentation logic, with varying degrees of success, but even when it was a success it was like... some random nerd's idea of what an architecture should look like (possibly loosely based on some architecture book), and there was a big learning curve between projects.
work-life balance
Working from home wasn't a thing, which was both good and bad. A lot of late nights in the office. This was perhaps more true for folks like me doing early web stuff. The technology was changing so quickly under our feet, and was so inconsistent between browsers and browser versions, that everything was a mess.
It was probably a little more sane for the guys working with Visual Basic, Delphi, FoxPro, whatever.
And when I say "guys", I really mean guys. It was incredibly rare to see women developing software. It was cranky old greybeards and pimply-faced college geeks who drank a lot of Mountain Dew. Just way more nerdy in general and not necessarily the good kind of nerdy.
"What was Coding like 40 years ago?":
And a subscription to the Microsoft knowledge base which came on CD (in addition to the books others have talked about)
And I vaguely remember that the debugger inserted itself between DOS and Windows, which meant that it could crash Windows if something went wrong.
Fun, but slower than today.
No internet. Same problems.
When I started, I was coding on a 24x80 CRT connected to a serial-port switch so I could connect to several different systems (e.g. PDP/VAX). A bit later I got a 50x80 terminal with two serial ports and that was considered luxurious. I also worked on Macs and PCs, which were a bit better but less than you might think.
Related to that, lacking anything resembling a modern code-navigating autocompleting tooltip-providing IDE, people actually wrote and updated and read technical documents. Reviewing them was also a Real Thing that people took seriously. Also, books. Another commenter has mentioned Barnes and Noble, but if they even existed then I wasn't aware of them as a resource for technical stuff. The one I remember was Quantum Books in Cambridge MA (later Waltham). It was common to have a bookshelf full of manuals for various UNIX flavors, or X (the window system), or Inside Mac, depending on what you were doing. If you didn't have the book you needed, maybe a coworker did so you'd walk around and ask/look.
There weren't a bazillion build systems and package managers and frameworks. You had tarballs (or the cpio equivalent) often used with for-real physical tape, and good old "make". Automake and friends might have existed, but not near me. It might sound limiting, but it was actually liberating too. Just unpack, maybe tweak some config files, and go. Then see how it failed, fix that, and go around again until you got something worth working with.
A lot of things that seem "obvious" right now were still being invented by people much smarter than those who now look at you funny for suggesting other ways might even exist. I actually worked with folks who developed now-familiar cache coherency algorithms and locking mechanisms (now embedded in ISAs) and thread models and so on. Not much later it was distributed-system basics like leadership, consensus, and consistent hashing. These kinds of things either didn't exist at all or weren't well known/developed. And it was fun being part of that process.
It was possible to know all parts of a system. At Encore I was able to describe everything that happened from hitting a key to a remote file being written - keyboard drivers and interrupt handlers, schedulers and virtual memory, sockets/streams and TCP/IP, NFS, filesystems and block-device drivers. Because I'd worked on all of them and all of them were simpler. I was far from the only one; those were not far from "table stakes" for an OS person. Later in my career it was practically impossible for one person to understand all of one part such as networking or storage. That makes me kind of sad.
Work/life balance was, perhaps surprisingly, not all that different. Crunch time was still a thing. I worked plenty of all-nighters, and even worked nights for extended periods to have access to scarce new hardware. I had more intense and less intense jobs over the ensuing years, ending on one that was at the high end of the scale, but those were all choices rather than changes in the surrounding environment.
Compensation was good, but nowhere near modern FAANG/unicorn levels. Maybe founders and near-founders haven't seen it change as much, but for the rest of us the difference between a top-paying and a median senior/principal/staff kind of position has become much larger. 90% of developers used to be in the same broad economic class, even if some were at the low end and some were at the high end. I'm not sure that's really true any more.
That's all I can think of for now. Maybe I'll edit or reply to myself if I think of anything else later.
My first 10 years on the job were Turbo Pascal and Delphi for various shops. Working in that old DOS-based Turbo IDE felt like magic; I remember plumping for a shocking blue, yellow and pink colour scheme - I miss Pascal. The move to Delphi was a huge change: OOP, native strings longer than 255 characters, and some truly unbelievable drag-and-drop GUI building functionality.
We had no source control until we started using Delphi; I think it was Subversion, but there might have been something before that. Prior to SVN it was a case of bagsying report.pas for the day.
Thinking back, and maybe I've forgotten, but I don't think we shipped anything that was particularly worse than stuff I see getting shipped today. Yeah, stuff went wrong, but it still does. Without reviews, Git, CI, etc we still shipped something that largely worked and kept customers happy.
Code quality was bad. No standards were followed across the team, so files all had different styles. It wasn't uncommon to see procedures that were 100s, maybe 1000s, of lines long. Turbo Pascal's integrated debugging was a life-saver.
Unit testing was not a thing.
I think we wrote far more stuff ourselves, whereas today there's a lot more libraries and systems to use instead of building.
Obviously there was no Stack Overflow, I signed up to that when it first came online, it has been a game-changer. I read a lot more programming books back then, you had to. I think there was a lot more try-it-and-see work going on, I used to write many small apps just to work out how some thing needed to work before touching the main codebase, that's something I still do a lot today, I'm not sure the new-bloods do?
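A sketch of that try-it-and-see habit (a hypothetical example, in C): a throwaway app whose only job is to answer one question about library behavior before the main codebase depends on it:

    /* Hypothetical throwaway probe: does strtok() return empty tokens
       for consecutive delimiters? Run it once, learn the answer, move on. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[] = "one,,two,three";
        char *tok;

        for (tok = strtok(buf, ","); tok != NULL; tok = strtok(NULL, ","))
            printf("token: '%s'\n", tok);  /* prints one, two, three:
                                              empty fields are skipped */
        return 0;
    }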
Work-life balance was absolutely fine, there was no pressure to work extra hours but I don't find that there has ever been. I've always prioritised family-time over work, I put in full effort for my contracted hours, the second they are up, I am gone.
I certainly enjoyed programming a lot more back then, it felt closer to the metal, it felt like a voyage of discovery, I couldn't just Google to find out how to pack a file down to 50k whilst keeping read-time quick, I mostly had to work it out myself, read a book or ask colleagues. You had to work harder to learn stuff, and I don't know, it felt like that hard-won knowledge stayed with me more than something I googled last week.
Modern languages have abstracted away a lot of the complexities and that is of course a good thing, but I kind of miss the pain!
Thankfully, the much smaller size and scope of a typical software project partially offset the lack of sophisticated tooling.
In 1992, everybody knew that object-oriented programming, with languages like C++, was The Next Big Thing, but that future was not here yet and most everybody grovelled around in C or Pascal. This kind of thinking led to object-oriented bags on the side of even Fortran and COBOL.
Oddly enough, no-code tools did exist, but they were called CASE tools and were just as bogus then as today. GUI builders like Visual Basic hit the market.
On the upside, if you had a knack for computers it was easy to start a career in them. Programming was seen as a great career path for smart people who were "on the spectrum" (Asperger's syndrome was just barely entering public awareness). It was a lot more technical then than now so you really had to know how things worked at a low level. These days the "people persons" have really taken over and you really need to be one in order to thrive in the field.
Plus it was just a lot more fun then. People thought computers becoming mass-market devices would change the world for the better. Ads were for television and newspapers, not manipulative "tech" companies, and most programmers in the industry -- yes, even Microsoft, who weren't the evil empire yet in 1992, that would've been IBM -- really wanted to produce something useful, not just something that drives clicks and engagement. People also wanted to just mess around a whole lot more. Sometimes you'd hit the jackpot with your idea, but the important bit was getting it out there, not necessarily making millions with it. A college student named Linus Torvalds started writing an operating system just to see if he could. (Back then, operating systems were A Big Deal and could only be written by Real Programmers with decades of experience. Word was that Real Programmers working on Real Operating Systems would put the Linux source code up on a projector after hours, crack open a few beers, and have a laugh.)
It was a lot more "wild west" back then, but easier to find a niche, especially for individual programmers. Sometime in the late 90s this consultant douchebag for Microsoft decreed "beware a guy in a room" and so the focus from a management standpoint became "how to build the best software development hivemind" and we've been dealing with the effects of that since. That and the phenomenon of people who want to sit at the cool kids' table, but can't, so they take over the nerds' table, because better to rule in hell than serve in heaven.
1. Processes: mostly, winging it. But since I worked in a regulated industry, there was a lot of very specific stuff the Feds expected of us; and almost no internal documentation to get someone up to speed on it. Downside was you often did stuff wrong once before some Senior (which in those days meant more than a couple years out of college) made you do it again. But this was part of learning, and I'm sure they learned it that way too.
2. Design Principles: um, mostly winging it, but I'm sure it was less freewheeling in parts of the company like manufacturing. For those of us working mostly with data, we were all very interested in outcomes, and some of us cared a lot about the users, but a lot of the software was solving problems that had never been solved with software before, and we made a lot of it up as we went along. Towards the end of my first job I went to a big HCI conference and thought, hmm, there is a lot of jargon and long-winded academic presentation going on -- but is anyone outside the bubble listening? (I guess OS designers were listening, and I thank them for their sacrifice.)
3. Work-life balance: we worked hard, and yes we played hard (not just the young) -- but it was pretty much in line with everyone else in the SF area working in high-energy industries at the time. There was no expectation of better or worse working conditions if you were doing software versus anything else you might do. You had an office if your peers had offices. Then later, with the startup craze, all-nighters and lofts and so on came into vogue, but that was self-inflicted (and at that age, I really enjoyed it).
4. Compensation: we got paid as much as the other professionals, and not more, but usually also not less for our relative place in the org charts. It was nothing like the situation now, where there are maybe a dozen professions that get way higher pay than their equals in other professions; nor was there such a big gap between the US and other rich countries. But then, SF was a cheap place to live back then. Much changed and it changed fast. (For one actual data point: when I finally made it to "official computer guy" was when I started making more than my local bartender, who until then had made the same as me but only worked three nights a week.)
And as a general riff:
Almost nobody was in it for the money, which is not to say nobody was trying to make a lot of money. Rather, everybody I encountered who was doing software was more or less destined, by their talents and affinities, to be doing software at that moment in history. The people who were primarily about the money got out of coding as fast as they could, and did management or started companies, and that was for the best.
Everything was magical and new magic was being invented/discovered every day! I myself got to go from working on character terminals to Macs to Irix and back to Macs again; to discover Perl and write ETL programs as what was then a "junior" -- but in today's world would probably be a "senior" because I had two years' experience. I would read man pages like people read (used to read) encyclopedias, which is kinda what they are. I had a housemate who would get high and write what we would now call the "AI" of a famous video game. I wrote my first CGI programs on a server at best.com. Every step that now seems mundane was, at the time, amazing for the person taking it for the first time -- writing a crappy GUI! client-server! relational databases! friggin' Hypercard! A friend was given a do-nothing job and became an expert at Myst before he quit, because... magic!
The path from writing code to business outcomes was a lot shorter than it is now, and I speculate that this put a limit on the gatekeeping attempts by folks with whatever "elite" degrees or certificates were in vogue.
And there were a lot more people (relatively speaking) who came from different backgrounds and, like me, had found themselves in a time and place where "playing with computers" suddenly was part of the job, and a springboard from which to grow into what we eventually started calling, correctly IMO, Software Engineering.
But back then, in biotech and in the bright shiny new Internet world I joined around 1998, boy oh boy was it not Engineering. It was inventing! It was hacking! And it was a lot of fun.
My main development machine was a RISC OS machine with a 20 MB hard disk (which crashed the day after I got it - glad I made backups on floppy disks!). Back then I would make printouts of programs to refer to, on tractor-fed dot matrix paper. The idea of printing a program seems very old fashioned now!
My editor of choice was called !Zap which was like someone ported the essence of emacs to RISC OS. It wasn't until 1998 that I used actual emacs and felt immediately at home.
I had lots of reference books. The giant programmers reference manual for RISC OS, and dozens of CPU manuals (if you asked the chip manufacturers nicely they would send you a datasheet for free, which was usually more like a book). I had a few books on C but I didn't consult those much as C is easy to keep entirely in your head.
As for process, and design principles - they could be summed up as "Get the job done!". I had very little oversight on my work and as long as it worked it was up to me how to make it work.
Compensation was excellent, programmers were in demand. So much so that I set up my own business.
Computers have got so much faster since then, but it doesn't affect the job of programming much - it is still the meat component that is the limiting factor.
The internet (which I didn't get access to until 1993) has completely changed programming though. So easy to look something up, whether that is the arguments to memcpy or an algorithm for searching strings. It wasn't until the internet that the open source movement really took off - distribution was a real problem before then. I used to download open source/freeware/shareware from BBS or get it from disks on magazine covers.
Having access to high quality open source libraries has made programming much better. No longer do I have to write a bug ridden version of binary searching / quicksort / red-black trees - I can just use one from a library or bundled with my language of choice.
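Even plain C shows the difference: the standard library's qsort and bsearch stand in for the hand-rolled (and likely bug-ridden) versions mentioned above. A minimal sketch:

    /* Sketch: sorting and searching with the C standard library instead
       of a hand-rolled quicksort and binary search. */
    #include <stdio.h>
    #include <stdlib.h>

    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);   /* avoids overflow, unlike x - y */
    }

    int main(void)
    {
        int v[] = { 42, 7, 19, 3, 88 };
        size_t n = sizeof v / sizeof v[0];
        int key = 19;
        int *hit;

        qsort(v, n, sizeof v[0], cmp_int);
        hit = bsearch(&key, v, n, sizeof v[0], cmp_int);
        printf("found %d? %s\n", key, hit ? "yes" : "no");
        return 0;
    }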
Not having to write everything in C / Assembler is a bonus too! I didn't meet Perl until 1998 and then it was transformative. I soon moved onto python which I still use a lot as my general purpose glue language. I'm a big Go fan, it seems like C without the hard parts, with concurrency and a massive standard library.
Are things better now? Mostly I think, though there is often far too much ceremony to get stuff done (here's looking at you Javascript ecosystem!). Programmers in 2022 have to build less of the foundations themselves which makes life easier, but it takes away some of the enjoyment too.
From the age of 14 I was in a computer club. Basically a bunch of nerds from ages 14-60 working with computers 24/7. Some ran Unix-like systems; most had a Windows system or RedHat. You had to take your desktop computer and (huge) monitor with you to every meeting. It was a whole operation just to get there.
Some guys worked at Philips (now ASML) and had very early versions of coax NICs. We built our own LAN networks and shared files over the network. We played with network settings, which was basically what we did all day. Lots of terminals and DOS screens. Later, Netscape was the browser of choice. When the Windows era really took off, some of us started to hack Windows 95 machines. Often, with the use of telnet and a bunch of commands, you were able to find a lot of information about a computer. Sometimes it was just as easy as sending a picture with a hidden backdoor made with tools from Cult of the Dead Cow. We did it for fun. There was no scamming involved.
We used ICQ and IRC to communicate online. Later there was MSN Messenger which was great to communicate with classmates. There was no cable internet. You had to pay per minute for internet over the telephone line.
Free software was available on Warez sites or FTP sites. Also newsgroups but this was mostly for reading news. I had FTP servers running everywhere. People connected to them through the whole world. Some left a thank you note after downloading music or the newest Winamp version. There was no Napster.
We went to LAN parties. Basically a university opening its auditorium for 150 nerds with huge desktop machines drinking Jolt Cola all night. Some companies sponsored the events and tried to recruit Java developers. There were games where you had to hack a computer. A big part was always the opening, where the (hack) groups presented themselves with 3D movies. The quality was insane for the time.
Also, during the Windows 95 era lots of software ran in a DOS prompt. BASIC and Pascal were languages often used for this. The applications were not structured in a great way. You had to open each file and analyze it. I can't remember developers using many comments in the code. Files were fairly easy to read and understand. There weren't that many references to other files.
If you wanted your application to run in a DOS prompt you had to write everything, even the mouse cursor moving over the screen. There were no packages or other predefined code. There was no autocomplete, code checks, Stack Overflow, or decent debugger. There was even a time without line numbers.
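For flavor, here's roughly what "writing the mouse yourself" looked like - a sketch assuming a DOS-era compiler like Turbo C with dos.h (it won't build on a modern system), polling the INT 33h mouse driver by hand:

    /* Sketch for a DOS-era compiler (Turbo C's dos.h; not portable):
       you asked the INT 33h mouse driver for state yourself, because
       nothing else would do it for you. */
    #include <dos.h>
    #include <stdio.h>

    int main(void)
    {
        union REGS r;

        r.x.ax = 0x0000;            /* AX=0: reset and detect driver */
        int86(0x33, &r, &r);
        if (r.x.ax == 0) {          /* AX stays 0 if no driver loaded */
            printf("no mouse driver\n");
            return 1;
        }

        r.x.ax = 0x0001;            /* AX=1: show the cursor */
        int86(0x33, &r, &r);

        r.x.ax = 0x0003;            /* AX=3: buttons in BX, x/y in CX/DX */
        int86(0x33, &r, &r);
        printf("mouse at %u,%u buttons=%u\n", r.x.cx, r.x.dx, r.x.bx);
        return 0;
    }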
It was the most fun I ever had on a job, despite working on an "uncool" (non-Unix) system, largely because of the really smart people there, and the opportunities to do fun stuff (e.g., writing Rexx plugins to allow direct control of devices, including handling device interrupts, for use in tooling). Also, being young and less experienced -- so everything seemed new -- helped.
Processes: Until 1990 or so, we had a dedicated person who served both as "the source control system" and release manager. Once a week, we submitted changes to this person, she merged them with other people's changes, and put the merged files onto the official source disk. She also built the weekly releases (which were installed on our hardware every Saturday night). I am not sure what happened after 1990... I think we rotated that job between each member of my team.
I also believe, maybe incorrectly, that stuff was far better documented back then. We had a giant rack of IBM manuals that pretty much had most of what you needed. Some of the more experienced workers served as human Stackoverflows. We also had access to some bulletin-board like system that served as a discussion group for VM/370 systems programmers, although I only used that once or twice.
Design principles: I don't really remember much about that, but for big changes we did a lot of writing (by 1992 we might have started an intranet-type thing, but before that we just distributed these proposals as hard copy). I remember we had tons of memos in our memo system, with keywords cross-referenced in an index. I used to peruse them for fun to see how something ended up the way it was.
In general, we documented a lot: we wrote memos for change proposals, for what we actually ended up doing, the root cause of system crashes, tools, etc. We would often update those memos (memos had versions, and all versions were kept). I guess our memo system was sort of like a company intranet, but somehow it seemed less crufty and out-of-date, but maybe it only seemed that way because there was no search to turn up deprecated documents.
Work-life balance: Not great, I guess, but that partly could be on me. I loved my job, so I worked too much and I think it did have some long-term negative consequences. But there were deadlines and system crashes that needed to be figured out. There were periods with lots of weekend work: We had a hypervisor, so we could test hypervisor changes in a VM whenever. But for big changes, we needed to test things on real hardware, and we could only do that when the production systems were shut down late on Saturday night/Sunday morning.
Compensation: I really have no memory of what I was making in 1992. If I had to guess, I would say around 45K, which is about 81K in today's numbers. So it was a little on the low side, I guess, for a programmer with 7 years experience. But I didn't know any better, so I was happy (I had no idea what anyone else was making, and I could afford to live on my own and have the occasional electronic toy).
In 1992:
- Every programmer had at least seen, and possibly even coded in, _every existing language_. I had college classes that touched on or taught COBOL, FORTRAN, Pascal, BASIC, Ada, C, C++, and even a couple different assembly languages. The current proliferation of languages can be a bit overwhelming when you're striving for the "right tool for the job."
- Using C++, especially cross-platform, was an act of frustration due to lagging standards adoption. Like others have said--you only learned from books, and stuff that the books said you could use, like templates and exception handling, just didn't work on a ton of different compilers. gcc/g++ was still rough--using the compiler provided by the OS was still the norm.
- UNIX boxes like Sun or SGI workstations were _crazy_ expensive, but if you knew the right people, you could buy used stuff for _only_ the price of a Honda.
- There were 20+ different UN*X variants already, all with only a modicum of compatibility. 16-bit SCO, 32-bit SunOS, and 64-bit DEC Alpha architectures made porting...a challenge (see the sketch after this list).
- 1996 was the first time I saw a Linux box in the wild (at a UNIXWORLD expo, I believe? The star of that show was Plan 9.)
- Agile/XP/scrum was years away from common adoption. Code reviews were nonexistent. Pair programming nonexistent. Continuous integration nonexistent. Unit, system, and integration tests were still a decade away: QA was a team that _manually_ tested builds using a test plan.
- Becoming productive in a code base took time due to the lack of standards, which made code somewhat impenetrable (sometimes by design, for job security), and system setup took days (no Docker!). Some people used `make`, but there were a ton of other home-grown tools as well.
- Source code control wasn't widely adopted. Some companies didn't see the need. This is pre-git and pre-subversion--some people used CVS/RCS. To get a tree, you had to rsync from another dev's tree, or a master shared filesystem.
- Having someone else look at your code was unusual, compared to typical pull-request-style workflows of today. No bikeshedding issues back then, but also, no real opportunity for mentorship or style improvements.
- There wasn't really an idea of "shared open source libraries" yet. CPAN (for perl) was one of the first, and started in 1993.
- You had to actually _buy_ tooling, like a compiler (SUNWspro, Visual Studio, or the X-Motif builder toolchain). It made coding at home on a budget a lot more limiting.
- Work/life balance back then, just like today, varied widely in Silicon Valley. You had to pick a company that met your needs.
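On the porting challenge mentioned a few items up: here's a minimal sketch (hypothetical code, not from any system mentioned in this thread) of the single most common trap. Old code routinely stashed pointers in `int`s, which happened to work wherever the sizes matched and silently corrupted addresses on LP64 systems like the Alpha.

```c
/* Hypothetical illustration of the classic porting trap: assuming a
 * pointer fits in an int.  Fine where sizeof(int) == sizeof(void *);
 * on LP64 systems (e.g., DEC Alpha), the high 32 bits of the address
 * are silently lost. */
#include <stdio.h>

int main(void)
{
    char buf[16] = "port me";

    int   addr = (int)(long)buf;      /* truncates on LP64 */
    char *back = (char *)(long)addr;  /* may no longer point at buf */

    printf("sizeof(int)=%d sizeof(long)=%d sizeof(void *)=%d\n",
           (int)sizeof(int), (int)sizeof(long), (int)sizeof(void *));
    printf("original %p, after round-trip %p\n", (void *)buf, (void *)back);
    return 0;
}
```

Compile that with a 64-bit gcc and the two `%p` values typically diverge; on 32-bit SunOS they were identical, which is how bugs like this could hide for years.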
Work was mostly well distributed and planned well in advance, and overtime happened only around field testing, release dates, acceptance tests, etc. -- none of this on-demand agile bs that we have these days. Every major/large feature was broken into requirements (or use cases, in some projects), written by a very senior person (or a group of them) and reviewed for every spelling mistake (I kid you not) before being handed over to the dev team. We used to have workshops where people from different modules (the old name for the modern microservices) would sit around a physical table (not a Slack/Teams meeting) and pore through the printed document or on their laptops, and one person would literally take notes on what was discussed, what the open issues for the next meeting were, and so on. Only when every requirement was completely addressed would it get the green signal to move to dev.
Test and dev were different teams, and in the companies where I worked the V model was popular: a test team would write test cases and the dev team would write code against the same requirements. Testing was a vertical in itself; while devs usually handled unit/module testing (MT), system integration testing (SIT), field testing, and customer acceptance testing were done by dedicated teams. The goal was to capture 90% of defects in MT, some 7-8% in SIT, and only 1-2% in the field -- and theoretically nothing post-release. We devs were given targets for how many bugs to expect from each phase, as a measure of our coding quality. Code reviews had a reviewer, a moderator, an approver, and so on, making them a very important event (not the offline bs that happens today). A post-release bug would be a nasty escalation on both the dev and test teams.
Did I also mention that the MNC had a tech support team who knew most systems at a high level, worked in shifts, and, unless there was a bug that required a code change, could handle and resolve most customer escalations? Bugs requiring code changes would be sent to the dev team only after a formal handshake between the dev and support teams. Those bugs were treated the same way as features and went into maintenance packages released every once in a while (same dev/test cycle as features).
In some projects there were separate teams, one fixing bugs in previous releases and one building new features for the upcoming release, and they would be rotated after each release!
I always thought that moving to agile/scrum would make life easy and fast. While it shortened release cycles, software quality has taken a huge hit: most code these days is copy/pasted, reviews are mostly lip service, and the end result is that most engineers are forced onto pager duty, called round the clock to fix the mess they made. Interview processes are mostly focused on irrelevant DS/algo questions and abstract design problems, with little to no emphasis on a candidate's experience. I had one interviewer tell me that they really don't care about a candidate's experience -- only their performance in the interview matters (yeah, no sh1t Sherlock; that explains why the company needs 24x7 on-call dev support!).
Call me old-fashioned, but I do miss the old way of building boxed software (plan/analyze/design/code/test/ship and maintain). Work was relatively more predictable, and the office felt like a place where people actually collaborated and worked together to build something they could be proud of!
In 1988 I was a 25 yo working for a 10ish-person, KP-funded start-up that wrote a Mechanical CAD package that ran on Microsoft Windows 3.0. The premise was that PCs would take over, that the mini and micro segments would disappear, and that VARs would no longer be necessary to sell hw/sw and train people to use apps.
The application was written in C (not C++) for Windows. It took significant parts of an hour to compile (see the XKCD comic on sword fighting during compiles). Some of the demos we'd do would be on the COMPAQ luggable machines. We'd find bugs in the Windows API. We wrote our own object-oriented DB that lived in memory and on disk (a sketch of the general technique follows below). The "Algorithms" book was 18 years away. Most of the team had been through 6.001 (THE 6.001) and had that as a basis. We had to solve pretty much everything -- no real libraries to drop in. Our initial network had a single 68000-based Sun machine with SCSI hard drives (10MB, then 100MB, as I recall) running NFS, with PC-NFS on all of the PCs, connected via coax-cable Ethernet. We used CVS as our source control. We later got a SPARCstation to do ports to Unix, and it was very much a thing to port separately to Sun, Intergraph, and SGI workstations, since the OSs were different enough.
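Since "we wrote our own object-oriented DB in C" may sound odd today, here's a minimal sketch, assuming nothing about the actual product (every name below is invented), of how that style was typically hand-rolled in pre-C++ C: fixed-size records with a persisted type tag, written straight to disk, with per-type behavior re-bound through a function-pointer table on load.

```c
/* Minimal sketch of a hand-rolled "object-oriented DB" in pre-C++ C.
 * All names are hypothetical.  A persisted type tag plus fixed-size
 * records let you fwrite() objects straight to disk; behavior is
 * re-bound through a function-pointer table when a record is loaded. */
#include <stdio.h>

typedef struct Object Object;

struct Object {
    int    type_id;        /* persisted; used to look up behavior on load */
    double x, y;
    char   label[16];
};

typedef struct {
    const char *type_name;
    void (*draw)(const Object *);   /* vtable-style per-type behavior */
} ObjectOps;

static void draw_point(const Object *o) { printf("point %s at (%g,%g)\n", o->label, o->x, o->y); }
static void draw_line (const Object *o) { printf("line %s starting at (%g,%g)\n", o->label, o->x, o->y); }

static const ObjectOps ops_table[] = { { "point", draw_point }, { "line", draw_line } };

int main(void)
{
    Object a = { 0, 1.0, 2.0, "P1" };
    Object loaded;

    /* "Persist" the object: a fixed layout means a raw write suffices. */
    FILE *db = fopen("objects.db", "w+b");
    if (!db)
        return 1;
    fwrite(&a, sizeof a, 1, db);
    rewind(db);
    if (fread(&loaded, sizeof loaded, 1, db) != 1) {
        fclose(db);
        return 1;
    }
    fclose(db);

    /* Re-bind behavior from the persisted type tag, then dispatch. */
    ops_table[loaded.type_id].draw(&loaded);
    return 0;
}
```

The fixed record layout is what made the memory/disk duality cheap: the same bytes served as the in-memory object and the on-disk row.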
The first version took about 2 years (hazy...).
And after you'd written the product on Windows, getting it to RUN well meant writing programs that did runtime analysis of typical app usage (watching code swap in and out of memory) and then building custom linker scripts to pack code in a way that minimized paging, since PCs didn't have much memory in those days (a modern analogue is sketched below). I'd find out a couple of years later that this is how MSFT did it for their applications; they didn't tell us, we had to figure it out. Developers were Developers. Testers were Testers. Testing was done primarily by running through scenarios and scripts. We were date driven, the dates primarily driven by industry events, our VC funding, and the business plan.
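For the curious, the packing trick can still be expressed with today's GNU toolchain. This is a hedged modern analogue, not what we ran in 1989, and the function names are invented for illustration: tag hot functions into a dedicated section so the linker groups them onto the same few pages.

```c
/* Modern-GNU-toolchain analogue of profile-driven code packing.
 * Hot functions are tagged into a dedicated section so the linker
 * can place them together instead of scattering them by link order. */
__attribute__((section(".text.hot")))
void redraw_viewport(void)      /* hot: runs constantly while dragging geometry */
{
}

void export_drawing(void)       /* cold: maybe once per session */
{
}

int main(void)
{
    redraw_viewport();
    export_drawing();
    return 0;
}

/* Companion GNU ld script fragment, placing the hot code first so it
 * shares pages:
 *
 *   SECTIONS {
 *     .text : { *(.text.hot) *(.text .text.*) }
 *   }
 */
```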
As we got ready for releases, I recall sleeping under my desk and getting woken up when bugs were found related to "my area." That company was pretty much everyone's life -- we mostly worked, ate, exercised, and hung out together, and we were always thinking and talking about "the product." There was this thing called COMDEX that would take over Las Vegas each November as the SECOND biggest show for that town. The first was still the Rodeo :-). If you were in PC hardware or software, you HAD to be there. Since some of the team members were core members of the MIT blackjack team, when we went to COMDEX there was some crossing of the streams.
Design principles? Talk it over with the team. Try some things. I can't recall compensation levels at all.
That company got purchased by a larger, traditional mainframe/mini CAD/CAM vendor, about the time that I was recruited to the PNW.
Things better, or worse than today? That REALLY depended on your situation. As a single young person, it was great experience working at that start-up. It was a springboard to working at a mid-size software company that became a really large software company.
Today, it CAN be more of a meritocracy, since there are ways to signal competence and enthusiasm by working on open source projects and communicating with other developers. It's easier to network now. It's HARDER in that there are more developers in nearly any area than ever, and geography just isn't as important. But I also perceive that most people are less willing to make trade-offs like spending extra time today finishing something while it's still top of mind vs. "knocking off" and doing it tomorrow. That could just be my perception, however.
I still like working hard.