i) The demographics of the user base. Maybe "Young Hacker News" happens inside Discord or whatever.
ii) The fact that the past keeps growing. The body of interesting developments keeps growing, and while some of it is obsolete or merely nostalgic, other parts are not at all. In fact, revisiting the mindset and vision of past decades is often very inspiring and educational.
iii) The fact that the present is shrill, exhaustingly hyped, manipulated, and in a sense damaged. Tech is no longer a force for good, and we long for the time when that illusion was still available.
And getting worked up about all the "new and exciting!" is an outlook best suited to being a simple consumer of entertainment. If, on the other hand, you're no longer a kid, living in a world with some rather dire social and sustainability problems, then whether you're trying to understand the big picture in order to do something useful about those problems, or just to lose yourself in nostalgia for a bit, the past becomes considerably more important.
It's a modern phenomenon to disregard everything that has gone before as "outdated". The tech industry has had a good long run of not looking back, but we are far enough along now that a lot of good ideas that were abandoned or left fallow are sitting around waiting to be re-implemented or reinterpreted.
Two examples I'm involved in:
- https://htmx.org is building on the idea of hypermedia and has a distinct 90s flavor to it
- https://hyperscript.org is a re-imagination of HyperTalk, the scripting language from HyperCard, for the web
I like the idea of building organically on the ideas of the past. Lots of people way smarter than me just happened to be born at a different time in the arc of the computer revolution. Why wouldn't I look back at what they did, play with it and maybe think about how it might apply to today?
Everything has shifted towards planned obsolescence, and I barely spend money today even though I make far more than I did when I first started working.
I got a Tesla and it was built poorly; the interior kept breaking. A purchase should make me happy, but I felt like my dollar didn't go far enough for how much I paid.
I balled out on a $900 iPhone and it cracked on the first drop. Since then I’ve only purchased used iPhones for <200 bucks that I don’t care if I break.
Almost every single video game I have tried in the past few years crashes or has bugs. The last game I played was Cyberpunk and I even put a new computer together to play it. What a nightmare lol.
Everything is subscription-based now. There's no support unless you pay even more. And if you use something for a while and then want to switch to something else, good luck, because most of them make it hard to migrate.
Maybe I’m just getting older but there ya go.
While AI has exciting properties, there's not very much really interesting stuff going on. Mostly new libraries written on top of older libraries, abstracting some abstraction into something even blander.
When was the last time a genuinely new kind of hardware device was introduced? I don't mean some iteration of an old one, but a new one?
Like when we switched from working off floppy drives to hard drives, that really _changed_ something. When we started attaching modems to our computers and went out into the world. When the Voodoo chips came and we started having dedicated PCI cards just for drawing 3D graphics!
There is a bit going on with VR headsets, but they're not showing the same kind of promise anymore.
Software is becoming scarce. We sit with the most powerful general-purpose computers in the history of mankind in front of us, and we use them to watch "web apps" crawl forth with abysmal efficiency compared to the wares of yesteryear.
So, maybe some of the nostalgia is not so much because we want to look back, but rather because looking forward is frankly a rather bleak sight.
So it's much more fun to talk about the different ways, say, 1980s game consoles handled graphics – each had unique chips and unique ideas, but all simple enough to be explained in a single blog post – than it is to talk about how all consoles today use the same hardware and the same software stack to achieve the same mind-bogglingly complicated results that require years of study to understand.
Same with OSes, cars, cameras, you name it.
Now, the marketplaces are largely commoditized and new products are incremental improvements and extensions of previous ones. The old ideas and products are still there, fascinating history to be explored.
An example - on a new Linux system, look at the man page for "termcap", and you will find configuration data for the Teletype model 33 and the Lear Siegler ADM-3. That makes people like me go "Hmm!" in a fascinated tone of voice.
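If you want to do that bit of archaeology programmatically, here's a minimal sketch in Python that shells out to ncurses' infocmp; the entry names "tty33" and "adm3a" are just my guesses at how those terminals appear in the database and may be absent on your system:

    # Assumes ncurses' infocmp is on your PATH; "tty33" (Teletype model 33)
    # and "adm3a" (Lear Siegler ADM-3A) are guessed terminfo entry names.
    import subprocess

    for term in ("tty33", "adm3a"):
        result = subprocess.run(["infocmp", term], capture_output=True, text=True)
        # infocmp prints the capabilities recorded for the entry (columns,
        # cursor-movement strings, etc.), or an error if the entry is missing.
        print(result.stdout or result.stderr)

Seeing control sequences for 1970s hardware still sitting in a database your terminal consults today is exactly the kind of thing that prompts the "Hmm!".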
History is like dead reckoning navigation. We only understand where we are by knowing where we were and what was done to get from there to here, and there are those of us very interested in such things.
This is only a partial answer to your question.
I didn't quote the surrounding context from the essay, which itself is quite fun; do read the full essay if you can.
https://medium.com/@lindynewsletter/is-culture-stuck-eb31712...
Perhaps the same is true of HN / tech communities more generally. Tech as a sector was much smaller a couple of decades ago, and many of us probably only got our first computers in the late '90s or early 2000s. I'd argue it wasn't until the 2010s that the internet really became ubiquitous. I'm guessing a large segment of us have only recently been able to talk about past technology in any meaningful way.
I think this accounts for a lot of it.
You then got ahold of a $3K Apple ][ computer with 32K of memory and Integer BASIC, and the only way to get performance was 6502 assembly programming. A struggle.
Then PCs came along, where nobody ever needed more than 640K memory, so it was a struggle to partition your work to fit it all in, but at least you didn't have to do assembler anymore since there was FORTRAN or Pascal.
And then the barriers lowered even further with GUIs and more memory, but you still had to worry about true multitasking not really being implemented correctly, and your poorly-written program could still hog the cycles. A slightly lower struggle.
And now there is more memory than you know what to do with, unless you are a rocket or climate scientist. And you infrequently have to worry about performance. Code efficiency? Bah!
It's all about the struggle.
Soooo much in hipster tech is reinventing the wheel all over again.
(Why culture is dead is another topic…)
With the past, you know what it was; you have a romanticized image of it. At least I do.
The past is attractive because we were different then. We felt different, and we thought different. We like who we were then more than we like who we are now. We liked our future then more than we like our future now.
While the technician was manipulating the thing, I was surprised by how badly it's designed: how fragile its whole "state" and memory are, and how easily you can get it to reset itself or lock yourself out. This is, supposedly, an important thing, and it can be critical (no technician that night? You'd be locked out of your apartment until morning. Good luck figuring it out). What if it happened when you took the garbage out and your wallet and phone weren't with you? Well, you'd be in a weird situation. I now take my wallet and phone with me just to throw the garbage out. This is a regression in living standards.
And I'm a techie, and I'd love for everything to become IoT or "smart", but it has to be done right. The developers have to consider the criticality of the tool they are building and designing. Unfortunately, today, this is not taken into consideration. There is a blatant disregard for any of that, and instead you have a focus on aesthetics, advertising, marketing, short-term profit, scale, reach, data collection, etc... But real, reliable functionality is thrown out the window.
The system has to break at some point. I think we are at that point, and a big restructuring of the tech sector will eventually happen.
I don’t want to focus on the past, because the thought of the future, of making up for and crushing that sucky past, was what kept me going. But what else is there to focus on? Focusing on the future isn’t really an option when you are in this situation. I am not even going to have the money to go to the gym anymore soon. I might be living on the street by then. Focusing exclusively on the present will mean breaking myself mentally staring at the screen, simultaneously praying for and dreading the next interview that comes. If I break down now, I eliminate any prospect of escape. So I have to do whatever is necessary to prevent that, however counterproductive in other areas it is.
Although my childhood is not particularly trouble-free, as my parents are very strict with me, I still turn my thoughts back from time to time to temporarily escape all the hassles I've brought on myself.
Later, he talks to someone else about it, who points out that Paulie has nothing in his life now, and that Tony should just accept it if he wants to dwell in times that were more meaningful for him.
Marshall McLuhan had something to say about the phenomenon of how the emergence of new media drives society to retrieve remnants of the past…something new influences us to seek out something old.
Or something like that.
But yea, there’s something going on.
The cold, pale glare of tokenized elucidations from large language models.
YC and startup culture in general have pushed doing quick MVPs to validate the product. This is great for making companies, but these companies have been more about reducing friction than about creating something inspiring.
And maybe people have become jaded by new startups becoming successful with boring products. Dropbox is a great company but it's not really an inspiring product. Nor is AirBnB. Or Replit.
On the other hand, this has led to people also being skeptical about genuinely inspiring companies. I don't have the link here, but the amount of negativity about Boom was disappointing.
If you don't want HN to do old stories, stop wearing flares!
And the ones cheering it on here are the ones that still believe deeply in the power of the market and capitalism (e.g., if you're not good enough to have a job, then you're not good enough). Even if it doesn't work for them, it might someday, so better not to have any worker protections in place.
Kidding aside though, there are really just a lot of good things to talk about in the past, so why not haha
Still, speculating is irresistible so here's my 2c. It's because tech is relatively stagnant at the moment.
i. A lot of traditional "hacker" stuff is about how computers work/are designed, but innovation there has largely stopped. If you look at the newest operating systems (Android, iOS, ChromeOS, Fuchsia), they're all extremely same-y. The web, meanwhile, is kind of more grown or discovered than designed. To the extent these platforms do things differently from 90s-era operating systems, they're trying to make things less hackable and powerful in order to increase robustness and simplicity. Three come from a single company (or four if you count the web), and reflect the mindset/culture of that company. So people who hanker for the hacker spirit look backwards to an era when how computers worked was still up for grabs and there was at least some interesting competition of ideas.
ii. PL theory is likewise kinda stagnant. HN gets excited about Rust partly because it's the only new language in a while that has new ideas in it. Go, Swift, Kotlin, etc. - all nice langs, but very much of a muchness, and they avoid new ideas (maybe Swift gets some credit for trying stuff with strings). Compare this to the days when C, Lisp, Objective-C, Haskell, and Perl were duking it out. All very different paradigms and approaches. People who get absurdly passionate about PLs are kind of an anachronism these days, but they were once common because there used to be more at stake.
iii. Hardware: also stagnant outside of mobile. This is mostly the fault of Microsoft and Linux vendors IMHO. The mobile industry has shown that it's perfectly capable of doing cool stuff with hardware over the past 10 years, with folding screens being one of the latest gizmos we get to play with, but how much of that stuff made it to the professional workstation-class machines we actually use daily? Almost none! Even hiDPI screens are a trainwreck. The problem here is firmly in the software camp. Microsoft checked out mentally years ago, Linux is hostile to HW vendors doing new things because drivers are such a PITA to ship out of the box, and PC/hardware vendors can't do anything meaningful with Windows other than try to optimize cost. So the landscape consists of Apple porting HW upgrades from their mobile R&D, and that's about it.
iv. Web ideology has been fading with time and Google's dominance. HTML was never going to win any design awards but for years many hacker types were extremely loyal to it, perceiving it as an open system unowned by any particular company. Nowadays the loyalty is less, for various reasons. It's now OK to criticize it, it's not taboo anymore. People can look back on what it displaced and compare more rationally, hence the semi-common "Delphi was cool" type comments. This is a healthy pushback against the often irrational collective progressivism of the first decade of the 2000s, in which every "open" platform was cool, brilliant and the future, and everything privately owned was dangerous hot garbage. Sometimes I personally feel the pendulum has swung too far, like with the people who celebrate the inability to install non-Store apps on iOS, but there's no question that the pendulum has swung.
Transformers are going to create huge shifts in how businesses work at a fundamental level. Stable Diffusion and LLMs could collectively eliminate huge numbers of jobs.
The past doesn't have the same worries, and this tech revolution is not looking good for many.