## Scope:
1. use a VM for now
2. only use the tooling available at the time
3. a simple "hello world" site first, then
4. something a bit nicer, let's call it "Astro Marmalade" (any similarity to Space Jam is purely coincidental)
5. find a way to publish it and access it from the VM
6. nice to have: publish and access it from a modern browser
## Specs of the PC I want to emulate:
- 8MB RAM
- Pentium P60 (60 MHz)
- 428.4 MB HDD
- Windows 95 (with some badass skins)
(The PC I had in 1997: https://allegrolokalnie.pl/oferta/komputer-5p60-optimus-pentium-60mhz-oryginal)
My setup was Notepad → CoffeeCup HTML + Netscape (OK, a Polish fork called Sylaba, IIRC). I built six "personal sites" this way, but I didn't have internet access, so I have no clue how I'd have published content.
## Questions:
1. What VM/tooling recommendations do you have for an M1 Mac simulating this environment?
2. Role-play/travel back in time: what tools do you use, and what's the best, most authentic workflow?
3. Anything else I need to know?
In 1997 I worked on a commercial/intranet web app using Perl 5 with CGI.pm, behind Apache httpd, on a Unix box. I think we supported DEC Alpha, Solaris, and IRIX?
These all still exist in modern forms (well, not really the OSes), mostly unchanged, so if you really want to make it public, at least put a modern Apache facing the world. If you use a late-1990s Linux distro like https://archive.org/details/debian_1.3.1 you'll be working in effectively the same environment.
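If you do put a modern Apache in front, a minimal reverse-proxy vhost is about all it takes. A sketch, assuming mod_proxy/mod_proxy_http are loaded; the hostname and the retro VM's LAN address are placeholders:

```apache
# Hedged sketch: a modern Apache fronting the retro VM.
# Requires mod_proxy and mod_proxy_http; names/addresses are placeholders.
<VirtualHost *:80>
    ServerName retro.example.com
    ProxyPass        / http://192.168.64.2:80/
    ProxyPassReverse / http://192.168.64.2:80/
</VirtualHost>
```

The modern front end then handles the hostile internet while the VM only ever sees LAN traffic.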
What I did in 1997 was to implement my own template language, so I could have Perl process it to add the header/footer. It was something like "$$ insert filename.html $$" or something else dead simple like that.
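That whole template pass fits in a few lines. Here's a sketch in modern Python rather than 1997 Perl; the `$$ insert filename.html $$` directive syntax is taken from the description above, but the exact spelling is a guess:

```python
import re
from pathlib import Path

# Matches directives like: $$ insert header.html $$
INSERT = re.compile(r"\$\$\s*insert\s+(\S+)\s*\$\$")

def expand(template: str, root: Path = Path(".")) -> str:
    """Replace every '$$ insert file $$' directive with that file's contents."""
    return INSERT.sub(lambda m: (root / m.group(1)).read_text(), template)
```

Run over every page at "build" time, it gives you shared headers/footers without any server-side includes.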
I used XEmacs as my development environment, though of course many used vi back then too, and I used NEdit for a while in the mid-1990s, which is more like Notepad.
We used CVS for version control.
To publish content now, using your home computer, get your router to forward external connections for a given port to the M1 Mac. Then use a dynamic DNS service so others can connect to your machine by name.
I haven't done this last step for about 12 years and don't know how it's done these days. Back then I could set it up with DynDNS, but they stopped offering free hostnames in 2014. My router also had built-in dynamic DNS support.
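Many dynamic DNS providers still speak the old dyndns2-style update protocol: an authenticated HTTP GET against an update endpoint. A sketch of building such a request; the provider, hostname, and IP below are all placeholders, and real providers vary in the exact endpoint and auth they expect:

```python
from urllib.parse import urlencode

def build_update_url(server: str, hostname: str, ip: str) -> str:
    """Build a dyndns2-style update URL.

    Credentials are normally supplied via HTTP Basic auth when the
    request is actually sent (e.g. with urllib or curl).
    """
    query = urlencode({"hostname": hostname, "myip": ip})
    return f"https://{server}/nic/update?{query}"
```

A cron job that checks your public IP and fires this request when it changes is essentially what the router's built-in DDNS client does for you.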
I know I used FTP to upload the pages, though I'm not sure which program I used.
And I usually used free hosting providers too. More like Tripod and my ISP than Geocities, but it was all much the same anyway: no (or very little) support for anything beyond HTML, CSS, and JavaScript, and ever-present ads all over the place.
If you want to simulate this setup with an M1 Mac, then probably TextEdit and the FTP program of your choice along with Neocities.
BTW, this text on the site is false: "Geocities has one big advantage - it has been around a long time, and will probably continue, so it won't suddenly disappear and take your web-site with it"
I also used a bigfoot.com "email forwarding for life" address, which I assume stopped working at some point.
Developed and served from Macintosh: Mac OS, BBEdit, WebSTAR (the most popular Mac HTTP server), Frontier (for programming, app integration via Apple events, and its object DB), and FileMaker for integration with business data.
Developed on Macintosh, served by Solaris or Linux: BBEdit, MacPerl (MacPython may have come in 1998), Apache with CGI to Perl/Python. I think I started using Postgres around then.
I played around with Microsoft FrontPage and NetObjects Fusion but always found my way back to HomeSite.
This was all running on Windows 95 on a home-built PC.
I would likely use HotDog for the website. It came on a CD with a magazine, and that is how I learnt what HTML was (before I ever got connected to the Internet).
Served via Apache on Solaris. `.htaccess` for fun and profit.
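For flavor, a few era-typical `.htaccess` tricks. These are all real Apache directives, but the paths and URLs are placeholders:

```apache
# Password-protect a directory, 1997 style
AuthType Basic
AuthName "Members Only"
AuthUserFile /home/site/.htpasswd
Require valid-user

# Custom 404 page and a redirect for a moved page
ErrorDocument 404 /errors/404.html
Redirect /oldpage.html http://www.example.com/newpage.html
```

Being able to do this per-directory, without touching the main server config, was exactly why `.htaccess` was so popular on shared hosts.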
I'd use Notepad to write code. The webpages should obviously be designed as complex nested tables, with valid HTML 4 tags from the era (https://www.w3.org/TR/html4/index/elements.html). Throw in non-standard MS tags like `<marquee>` for good measure.
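To illustrate what "complex nested tables" means in practice: presentational attributes (`bgcolor`, `width`, `<font>`) instead of CSS, with one table nested inside another for the header and sidebar/content split. A purely illustrative fragment:

```html
<!-- Era-typical layout: outer table for the page frame,
     inner table for the sidebar/content split. No CSS. -->
<table width="600" border="0" cellpadding="4" cellspacing="0" align="center">
  <tr>
    <td bgcolor="#000080">
      <font color="#FFFFFF" face="Arial"><b>Astro Marmalade</b></font>
    </td>
  </tr>
  <tr>
    <td>
      <table width="100%" border="0">
        <tr>
          <td width="25%" bgcolor="#C0C0C0">Links</td>
          <td width="75%">Welcome to my home page!</td>
        </tr>
      </table>
    </td>
  </tr>
</table>
```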
Set up Apache with CGI enabled. Use Vim to write your HTML and Matt's Script Archive to get you started with some CGI.
This is what I did back in 1997 to build sites for businesses.
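A CGI script is just a program the server runs once per request: it reads the query string from the environment and prints an HTTP header block plus a body to stdout. The 1997 version would be Perl with CGI.pm; here's the same idea sketched in Python. (Note: a real script should HTML-escape user input, which era scripts famously didn't; the parameter name is just an example.)

```python
import os
import sys
from urllib.parse import parse_qs

def respond(query_string: str) -> str:
    """Build a CGI response: header block, blank line, HTML body."""
    params = parse_qs(query_string)
    name = params.get("name", ["stranger"])[0]
    body = f"<html><body><h1>Hello, {name}!</h1></body></html>"
    return "Content-Type: text/html\r\n\r\n" + body

if __name__ == "__main__":
    # Apache passes the query string via the environment.
    sys.stdout.write(respond(os.environ.get("QUERY_STRING", "")))
```

Drop it in `cgi-bin/`, make it executable, and Apache does the rest.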
Your PC feels a bit underpowered, tbh.
IIRC, the typical off-the-shelf system (in my locale, at least) at the end of 1997 was more like a 166 MHz Pentium MMX with 16 MB RAM and 1 GB or so of HDD space.
2. HotDog, Dreamweaver, and/or FrontPage for HTML editing. They each have their strengths and weaknesses and work relatively well together. Normally I'd lay out pages in Dreamweaver or FrontPage and then clean up the HTML with HotDog (https://en.wikipedia.org/wiki/HotDog). Upload them with BulletProof FTP (http://www.bpftp.com/), an ancient predecessor of FileZilla.
3. Not sure if any of the old hosts are still around. Maybe Neocities.org comes close to the Geocities of old, or else you can get barebones and very cheap but reliable web hosting from https://www.nearlyfreespeech.net/
Get AOL dialup: https://getonline.aol.com/dialup and a 56k modem off eBay, no cheating with megabit connections.
You can emulate old browsers online: https://oldweb.today/
People generally weren't self-hosting their websites back then because phone lines weren't very reliable, and the dialup modems of the day (like today's cable modems) generally had faster download than upload speeds. After the BBS era, commercial internet hosts started becoming pretty common, but they usually used expensive ISDN or T1 lines.
Apache was still in its infancy then, and Microsoft IIS was common.
---------
Overall, I don't think the network stack is really THAT far removed from what we have today. The basics of HTTP were pretty fleshed out already (it was more HTTPS and the DNS security extensions that really evolved), along with better compression and parallelization in HTTP/2 and HTTP/3. And of course there's been a huge amount of backend server optimization: caching, reverse proxies, etc. But the old network stack would still work today, and you can still run a basic barebones HTTP daemon and firewall the same way you could back then, open up port 80, and watch your box get pwned by the bots. Hell, there are probably still zombie bots left over from that era running on someone's basement PC, casually scanning the internet day in and day out, waiting for their chance...
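A "basic barebones HTTP daemon" is still just a few lines with nothing but the standard library. A Python sketch that serves the current directory; port 80 needs root, so this defaults to an ephemeral port:

```python
import http.server
import socketserver

def make_server(port: int = 0) -> socketserver.TCPServer:
    """A barebones file server, 1997-style: serves the current
    working directory; port 0 picks a free ephemeral port."""
    handler = http.server.SimpleHTTPRequestHandler
    # Avoid "address already in use" on quick restarts.
    socketserver.TCPServer.allow_reuse_address = True
    return socketserver.TCPServer(("127.0.0.1", port), handler)

# make_server(80).serve_forever()  # the authentic (and risky) version
```

Bind it to `0.0.0.0:80`, open the port on the firewall, and the bots will find you soon enough.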
What's changed a lot more (IMO) is the language of the web itself. HTML was very basic then, as were JS and CSS. Making websites without modern CSS, in particular, is very painful... requiring a lot of nested tables, frames (remember those? they're different from iframes!), etc. A lot of the pain was abstracted away to backend scripting languages (Perl, ASP, ColdFusion) or frontend extensions (Java, Shockwave, ActiveX, Flash), and people didn't really work in vanilla HTML/JS very often (because they were so weak back then). Plain HTML is fine for your basic Astro Marmalade site with a bunch of animated GIFs and whatnot, but real ecommerce sites (such as they were) probably wouldn't have been written in plain HTML :)