Until recently, we used to do things like write cheatsheets and other help docs for our clients for tools like Google Analytics. This was all fine, and they were appreciated, as clients just don't know how to use these tools.
But recently, the rate of change has just made this untenable. I'd log into a tool like GA and the whole thing would be different. Not just the upgrade to 4, but then incremental changes there, too. So cheatsheets, training workshops, anything around support - just becomes untenable.
Another example: I log into Teamwork (my project management tool of choice) - and they're "retiring" the plan I've been on (and very happy with) for years. Instead I have to choose "Growth" and now my dashboard is littered with a whole bunch of stuff I neither want nor need. Nothing is where I'm used to it being.
And: we do a bunch of work with WordPress. The rate of change here is insane, too - every single update brings new features, none of which are documented, bedded in, or understood. None of which can be written about, supported, or workshopped.
And: Trello. It was fine. And then Atlassian bought it and it became this horrific behemoth of "features", all of which just clutter everything up, none of which seems to actually do anything useful.
And on, and on.
Is this rate of change supportable? Am I just too old? Help me put this in context, HN!
0) I am growing older and I may be getting more conservative. 1) I used a lot of products that evolved to a local optimum, but I see many of them being thrown back into an evolutionary state they already passed through a decade ago. Maybe to evolve better, but I have my doubts. 2) Everything is bloating now. Instead of a collection of good tools interacting, I now have 3+ ways of opening an Excel file someone shared in Teams. All of them are broken, and Teams is broken, too.
I feel as if the excellent wrenches I have been using for 20+ years are growing tumors in the form of a can opener. All to please people who never used a wrench or a can opener before. And production of the original wrenches is cancelled. Overnight.
Next time I need a wrench it may be made of felt, because fedoras are en vogue and the mad hatter has to sink venture capital into expanding his business.
(Fun fact: in German you could make up the perfectly valid and understandable, but still strange, word “dosenöffnerförmige Tumore”, i.e. “can-opener-shaped tumors”.)
#1) Software used to live on a disk you buy. That software doesn't change until you buy a new disk. Then it lived in apps, which you could update or not. Now it often lives in the mythical cloud, where changes can happen all the time.
Now, even on the disk front, things were always changing. Often for the better, sometimes for the worse. In the OS world, DOS -> Windows -> Windows 95 were big changes, and the OS9 -> OSX change was also huge! But now the changes are constant.
#2) The entire software world is built on VC money. VC money is not looking for slow and sustainable growth. Or a happy userbase of 10k people. The VC world doesn't mind if 99 companies crash and burn trying to harvest the wind while building their sails if one takes off.
#3) Outside of the VC/startup world, large companies must justify their existence, and every team and every programmer on that team must justify theirs. No one ever stuck around by saying "everything is good, we literally don't need to do anything or acquire another customer, let's all cut our hours to 2 days a week, keep patching bugs and making security updates, be happy with our current level of subscriptions, and spend a bit of time in R&D to make sure we have other bait in the water too if for some reason our users stop liking this."
You know the concept - jobs that exist just to exist and pay for someone's living, though they produce nothing of actual value, and it takes effort to actually remove so no one does.
With software it goes something like this:
1) A product is built, something that solves a real issue. It's new, rough around the edges and has only the basic critical features.
2) Funding is found and a large team is hired to implement what's needed and fix all the rough edges.
3) The team comes up with good, new ideas that actually improve the product experience for everyone, so in the eyes of management and the investors they have merit.
4) After the first 10 good ideas are implemented, the next ideas are... not as helpful, but some users still like them and nobody wants to fire anyone who's been doing really good work so far.
5) Fast forward a couple of years, the team is huge and there is really no headroom to improve the product. Everyone tries to come up with bullshit ideas and force their ideas through, just to have something to show come performance review time. The users stay with their tried and true product, and they're so invested at this point that a few regressions make them bitter - but not enough to leave.
6) Rinse and repeat.
I've been on that team. The engineers there are often mentally older, comfortable, and smart - or at least were smart. Many of them are sure what they do is important. But they're wrong. They now hold bullshit jobs.
Back in the day redesigns were well thought out and kept as many of the previous design decisions as possible. Ever since things started moving to the web, this seems to have changed.
Layouts, methodologies, subscription model, payment options, login options, core application design, it can all change from one month to the next. Everything is constantly being A/B tested, there is no clear release cycle anymore.
Now that iOS and Android finally seem to have settled down on a design, Windows changes their UI style again. Web frameworks seem to stagnate (finally) but there is already a slow move back to integrated server side rendering frameworks.
Modern software development is based on "move fast, break as many things as possible, get promoted or bought out by FAANG". The time of cheat sheets, manuals and curated workflows is over; everything is now SaaS/PaaS/IaaS/AaaS, and the only way to use a computer is to constantly relearn your workflow. Reading about new features and upgrade documents is no longer optional, because the next update could replace some of the old features you rely on with new ones.
Your computer, tablet, phone or TV could update tomorrow and you'll need to learn the entire contacts manager or file manager or settings menu from scratch, and there's nothing you can do about it. And those are the systems that undergo relatively infrequent UI redesigns.
There are some things you can do. Stick with LTS software if you can. Stick with single-purchase, self-hosted software if you can. Avoid anything with buzzwords ending in -aaS on their homepage like the plague, and try to switch to something else when your favourite self-hosted tool switches to an aaS model the way Atlassian did. You'll still end up using tons of crap that switches designs because the design team got bored again, but at least it'll affect as small a part of your flow as possible. Oh, and consider disabling automatic updates until security patches get released. I'd be the last person to advise someone to skip security updates, but redesigns often come in small parts and rolling updates, and you can delay them a little bit if you skip the unnecessary updates.
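One concrete (if hedged) way to do that last part on Debian-based systems, assuming you use the stock unattended-upgrades package: restrict automatic updates to the security origin only, so patches still arrive on their own while feature updates wait for a deliberate manual upgrade.

```
# /etc/apt/apt.conf.d/50unattended-upgrades (excerpt)
# Allow only the security origin for automatic updates.
# Regular feature updates are left for a manual `apt upgrade`
# you run when you're ready to deal with whatever changed.
Unattended-Upgrade::Allowed-Origins {
    "${distro_id}:${distro_codename}-security";
};
```

This won't help with SaaS tools, of course, but it at least keeps local software from redesigning itself overnight.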
Having stuff in the cloud, as a service, means you have no control over change. That is an unsustainable model for the end users. It'll take a while for the pendulum to shift back. Right now they can use the fact that you can't secure a Windows machine (ever) as a wedge to keep things managed... eventually this will be solved. (Capability Based Security) At that point people can run their own stuff again, and p*ss on the cloud.
You can still support it, you might even be able to make the case for bringing the stuff back under local control. It's just going to be a lot more work.
It's not your age, it's your intolerance of bullshit that is at work here.
Good luck!
However, there is ONE important side effect that I have not noticed until now.
Most technologists crave speed. Faster processors, faster disk drives, faster networks, faster everything. Bottlenecks are our common enemy. YES. They are evil. I can relate to it because I spent a good amount of time in my career fixing performance problems for financial systems.
No one likes to wait for the computer to respond.
Unfortunately, this craving for speed (in technology) has quietly bled into other aspects of our lives. People learn to speed read to gain more knowledge faster. People speed walk regularly (yes, I can also feel it in Hong Kong’s subway stations.) And the craziest thing is: we don’t realize it until our body cannot cope with the demands of our speedoholic minds.
I watched Carl Honore’s talk from 2005 (http://www.ted.com/talks/carl_honore_praises_slowness). 10 years later, it’s hard to believe that many of us (including myself) still get caught up in thinking “Slow is bad.” But no, there is such a thing as “Good slow.”
You need to be patient in building a relationship; you need to have a clear mind in thinking strategically; and you need to be willing to spend time making mistakes in order to invent something useful.
So, please don’t let us, technologists, news or media slloowwlllyyy turn you into a speedoholic.
No politician wants to repair existing infrastructure (bridges etc), but every politician wants to build new stuff. Because new stuff is what gets attention. Same situation in software. If I make a particular old feature 3 times faster or if I fix a 3 year old bug, nobody is going to take notice. But if I add a new feature, it is going to be noticed, whether the feature is needed or not. Then quickly forgotten, only to move on to the next "feature". This is what happens when your bonus, promotion etc is tied to shiny new stuff. People do what they need to do, to get ahead :(
As I grow older, I am more and more appreciative of things (anything - physical or digital) that do one thing and do that one thing very well. When I was a kid, my dad had a bicycle. That thing weighed a ton, looked butt ugly. My family abused that bicycle to the max, and it just worked, with almost zero maintenance. Same with every household item we had. They were basic, but they worked flawlessly, for a long time. And the reason they worked well was the absence of useless, stupid features that nobody needs.
I don't know what the solution is. But I am just tired. This doesn't even take into account the shiny new, half baked, undocumented tech that comes out every day and gets adopted for no reason.
FTWA [0]:
> Alvin Toffler argued that society is undergoing an enormous structural change, a revolution from an industrial society to a "super-industrial society". This change overwhelms people. He argues that the accelerated rate of technological and social change leaves people disconnected and suffering from "shattering stress and disorientation"—future shocked. Toffler stated that the majority of social problems are symptoms of future shock. In his discussion of the components of such shock he popularized the term "information overload."
The unfortunate side-effect of this is that users will never become proficient in the software tools, with 'power-users' being a thing of the past. As an example, I think MS Office peaked for me probably around 2010 in terms of features and my proficiency and now I'm actually regressing. If I want to export a PDF of the file, I can't remember if I should go to 'File -> Print', 'File -> Export', or 'File -> Share', because it keeps changing.
https://www.amazon.com/Thank-You-Being-Late-Accelerations/dp...
For a while after reading this, I believed that maybe we were in a bit of a slump and the rate of change wouldn't continue to accelerate. Now, with the societal / technological change brought by covid, the recent wave of building and excitement around Crypto (NFTs, DeFi, Web3), the continued jumps in AI, and a possible shift from phones to AR/MR glasses in the next few years, etc it's hard not to feel that the acceleration is continuing.
1. You make decisions by aggregate telemetry. This leads to degradation toward the lowest common denominator, and to changing the UI to drive statistical movement.
2. You make decisions by intense user research with a small subset of key customers. This leads to "enterprison" disease and if done too early in a company's lifespan, can kill the company from ever growing beyond effectively being an outsourcing partner for their largest enterprise clients.
When you have 25 million users, you can neither have a conversation with all of them, nor can you please all of them, so you end up having to make the best decision you can based on the data available to you, and we all suffer in small ways, but hopefully benefit in larger ways.
It's hard to understand for those of us (I'm nearly 40) who grew up working with technology, because the changes have happened in leaps and then suddenly in tiny increments constantly, like boiling a frog, but along the way the number of humans using the Internet and technology massively massively massively increased. The scale of the Internet today is unprecedented for any previous product or technology in known human history. Fundamentally, that means everything must either specialize and focus on a niche set of customers or it becomes driven by the tragedy of the commons.
Maybe I'm just old and curmudgeonly, but it feels like UI design has become a cesspit of ever-changing ideas.
It's also beyond frustrating to me when things that should and could work together don't, because they're from different vendors and we daren't give users a good experience, because that might allow other companies to exist and users to be empowered.
When I was younger, I was able to understand (and work with), a significant swath of the industry.
Shallow and wide worked for a long time.
That has not been the case, for many years.
Dependencies help to give people the sense that they are able to "keep up," but that's because they abstract the complexity. There are a few prodigies that can handle wide and deep; but they are rare exceptions.
I like to have a deep understanding, so I have specialized in native Swift development on Apple systems. I am proficient (but not a wizard) at PHP-based servers. I know enough to create backends for my apps, which work well. I know that others can do better.
This is not unique to software. All industries have been like this for over a century. Software had it easy. It was new enough, and small enough, that many of us could understand a great deal, about the entire landscape. This is how new tech always starts off. It used to be that every automobile driver was also a mechanic (out of necessity). I think the Wright Brothers were bicycle mechanics.
In all of your examples you keep mentioning that they either remove or change something you like thus forcing you to do something to keep up. That is the real problem, they are making changes that the user is forced to deal with. If all of these changes were fixing things you were legitimately annoyed with then you would be a happy camper, but that doesn't seem to be the case.
And there's another dynamic at play. The design community these days is infected with Apple-driven thinking, treating users as stupid and worrying about decluttering. The focus instead should be on how a design will scale for known unknowns and unknown unknowns. How will complex use cases compose with simple use cases? What happens when an unknown error happens? These are things that nobody seems to worry about in the UI design sense, IMO.
Maybe the pendulum will swing back towards self hosted/installed apps. Alternatively maybe there's a good business model cloning the "good old version" of popular SAAS apps and keeping them unchanging.
However, in (100-eps)% of cases this demonstrably worsens every product, or at least undercuts its initial innovative aspects. The reason is that everybody thinks locally and assumes the function is monotonically increasing; when they hit a plateau or a decline, they make extreme changes to the product (calling it disruption in the meantime). And much to my regret, management is one of the least accountable professions of our era, similar to Human Resources. If you listen to PM courses, cringeworthy practices are sold as success stories. On top of this, we have much too much power with very little accountability, hence, following the entropy principle, a much higher probability of doing the wrong thing.
Writing this while I desperately wait for Jira to render the page.
I started coding in the 80's, and got my first coding job early 90's, and the tech was moving even faster then.
I now have computers that last more than 3 years without becoming completely obsolete.
There's been one fairly consistent programming environment for at least 10 years now. We had "micro" in the 80's , Object-oriented in the 90's, internet in the 90's, and mobile in 2007. I spent less than 10 years making desktop applications in VB, and then both of those things stopped being at all relevant, while I've been coding web stuff in Go for 10 years now and it's still very relevant.
But I do find changes annoying now. I used to embrace it all, and be keen to learn all the new stuff. Now, not so much. I don't know whether I'm jaded with experience, or just old and want the world to stop changing around me.
In your org, do you give developers fat sweet bonuses for simply maintaining things, or more like for building new features? Because that's how those big tech companies operate.
1) Rate of change in fundamental technology.
2) Feature churn.
I don't think #1 is actually changing all that fast, compared to previous decades. Consider the period 1991-2001 and the period 2011-2021. I think technology change in the former was much, much faster than in the latter. A typical PC went from {486, DOS, 1-4MB RAM} to {Pentium 4, WinXP, 256MB-1GB RAM}. Linux had only just launched in 1991. ~Nobody had a cellphone in 1991. ~Nobody was on the internet in 1991.
But look at 2011-2021, and is anything really that different? Computers are faster, but it's nothing like the growth rate of the 90s. iPhones, Bitcoin, GPUs, broadband, cloud, Minecraft ... we had all these in 2011. They're just incrementally better now.
Fundamental tech is still incrementing but revolutions are few and far between.
#2, on the other hand, is in its golden age. And it's all for the wrong reasons, largely articulated by others on this thread. My addition: our ability to create new software has outpaced our ability to think of new ideas that are beneficial for users.
If your SaaS company is one product, then you HAVE to keep developing, and tweaking, and optimizing.
If you claim you're "done" you have to downsize to a maintenance skeleton crew.
So instead we feature bloat and everyone has to have this huge dashboard.
If instead we had something like Johnson & Johnson for software, this could be avoided. They'd just make thousands of SaaS products that each do a single thing extremely well.
But then I realized that everything you rely on is proprietary software which puts profit over user experience and therefore breaks working UX to get even more profit (something like this: https://news.ycombinator.com/item?id=29454289). Consider using free software alternatives instead [edit:] whenever you can.
Disclaimer: I, too, am old.
Things are changing fast because the challenges we are trying to solve are getting more complex and the tools need to be used by a larger swath of people in more subtle contexts.
This idea of "back in my day we had real wrenches" is exactly what you fear it is. Sure, you could build a house the same way they did in 1850, with crosscut and rip saws that you had to stop and sharpen every day, and nails you forged yourself. But today we need to 3D print houses out of concrete because there aren't enough trees, because all the old growth tight grain fir was clearcut over the past few centuries and even green wood grown to full size in 5 years instead of 100 is too expensive.
But I think the vision is faulty to begin with. When I started using computers in the 1980's for work, there were multiple text editors, and they all behaved differently, and yes, would go under or get upgraded. I loved working with PFSWrite on the Apple //e, but that eventually went under and we moved to IBM PCjrs running WordPerfect. It was a pain in the ass, why did they have to change my favorite text editor?!?
The pendulum probably won't swing for another 5 decades, when things become more "stable".
Also, I'm pleased to see so many people who are my age (mid 50s), and also kinda bummed to see so many people who are like me replying. Either it is endemic to us, or only we care about it and the diversity of HN is off on other discussions leaving us to kvetch.
The Basecamp founders said in a book or podcast that they intentionally keep old versions of their website UI around, for exactly this use case – to avoid forcing people to relearn a UI. I don't know if this is true anymore, so fact-check me on this, but if it is: consider giving them your business.
----
What's the underlying reason? I can only speculate, but having worked with product teams for a while, and gone through several redesigns, I suspect it has something to do with new leadership coming in, and risk-averse, unimaginative PMs.
New leadership because that's the #1 project to get the troops rallied behind you and to have that quick impact to put on your promo doc – get a new design out, make your stamp on the product, regardless of whether it's good or bad.
Risk-averse, unimaginative PMs, because a redesign is a safe product change and, similarly, a quick win to put on the promo doc. If you lack any ideas of where to take the product next, a redesign is an evergreen solution. You could spend time investigating customers' issues, interviewing them, analyzing the competition AND be "unproductive" while you do this (i.e. not have anything to show for it for a while), or you could be productive and do a redesign.
There may be other incentives that are misaligned, but those are the two I have noticed most pronounced.
My 2 cents. Curious if others have thoughts on how to solve it.
In general, software stacks are getting bloated and overcomplicated, and new updates/features are coming out really buggy.
But the underlying approach of using cheat sheets to walk step by step through work tools isn't a tenable one, especially not for cloud-based software (i.e. software that changes without asking first).
Instead cheat sheets should be for core concepts (a purchase order has these components… a packing slip must be compared against the order quantity, etc)
That way you know what everything is, and even when the UI changes you know what concepts you’re looking for. If the UI is any good, it’ll surface the main stuff and tuck away the rare stuff, but it’ll all still be there (if it’s suddenly not, that’s when you have to complain or switch vendors).
When I was a kid my elderly aunt would ask me to teach her how to check e-mail or open solitaire by writing down click steps for her. Inevitably a week later and a month later she’d need me to show her again, because she refused to develop a mental model of what a “button” is, a “menu” is, an “icon” is, etc.
If by some chance an icon moved two spots in a menu, she was dumbfounded, frustrated at the audacity, and unable to complete her goal.
The former is somewhat inevitable, I think; the latter is often an unnecessary irritation.
Some UI redesigns are good & necessary but many seem to be just for the sake of it. A bit like if you're paying UX designers they're never going to say "yep it is good as is" since that would raise questions about why a company is paying X thousands to people not doing anything. So you get this endless stream of reshuffling existing stuff with no real value add and often a negative from the confusion. Which ironically is a pretty horrible user experience.
See also icon changes on phones. There too the changes are breaking a lot of the user experience (quickly finding what you're looking for based on familiar icons) for the sake of well not much value add. Every icon looking like a rainbow definitely didn't improve my life.
I don't even have an idea of where to go to find out. So much is unknown gestures; some activate by accident when I want to do something else.
Text editing in Notes is difficult after they removed the magnifying glass. I really want it back.
Well-defined UIs can help by making features simpler to discover, but more often than not I'm not convinced the industry understands the difference between "the UI is easy to navigate and discover" and "the UI has soft colors and rounded corners on everything."
I had to crack open Google Analytics to grab the tag to paste into something. It took me 2 minutes to find it. WordPress is more or less the same if you go to /wp-admin/, but the infiltration of the new UX into it tells me that's on the way out.
As these more complex technologies become more ingrained in our lives, people who aren't able to keep up (that is, most people, especially as they age) will fall behind.
In this case, I think the same principle holds: “software developers gonna develop”!
Meaning that I feel businesses would prefer to see developers working as hard as possible and just assume their output is always an improvement over the current product.
At the level of competing businesses 90% of effort goes to waste due to failing software businesses, as the winner takes all.
At the internal business and team-of-employees level, if time was spent it is assumed that it is always useful and pushed to the end users (instead of allowing the environment to kill the worst 90%).
The killer problem is velocity is the only metric that matters these days to the whipmasters.
We sacrifice quality for this and that’s what’s really hurting us now. No one is seeing this because we’ve forgotten that the status quo of continuous toil is not normal.
20 years ago I’d be shot for doing things that are normal now.
I'm with you on the "the damn ground is moving too fast" sentiment. My back-end of choice is Erlang-based, because in relative terms it's grounded in immovable bedrock, they fixed all the bugs decades ago and Things Are Not Changing. All my FE libraries are deliberately chosen versions and there is no ever-churning build chain to trigger a cascade of breaking updates. While this approach has worked for me well so far, I can tell not that many others appreciate it. How do you teach the value of reliable constancy to people enamoured with relentless change and tools for tools' sake?
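The "deliberately chosen versions" approach on the front end can be as simple as pinning exact versions, so nothing floats with a `^` or `~` range and an `npm install` on a fresh machine can't silently pull in a breaking release. A sketch (package names and version numbers below are made up for illustration):

```json
{
  "name": "my-frontend",
  "dependencies": {
    "some-ui-lib": "4.2.1",
    "some-router": "2.0.3"
  }
}
```

Combined with committing the lockfile, upgrades then only happen when you deliberately edit a version number, on your schedule rather than the ecosystem's.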
I have recently been dropping software, subscriptions, containers and hardware like a good one. I have realised a lot of this stuff is just a giant waste of time. And time is more precious than ever now.
So the other day I set up an instance of phpBB for my own private use. A forum for just one person? Yes! I’m experimenting with using a completely vanilla install of phpBB without plugins or anything (did change the theme to one that I liked better than the default theme though), to organize my own notes because:
1. The bulletin board model of different forum categories for different topics make things organizable while not going overboard with the organizing.
2. The bump mechanism makes keeping track of current concerns easy by posting in the respective threads. Old stuff of little interest naturally fades into the background without any kind of manual effort.
3. It is searchable.
4. It’s open source and self-hosted, and it’s been battle tested for years. It’s far from perfect but it’s stable and I could probably run it for ever.
5. I’m making it available over my VPN only. I can use it in the browser of any of my own devices as these are all part of my personal WireGuard VPN. Meanwhile, because it’s not reachable from the wider net my install of phpBB is not gonna get trivially pwned.
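A minimal sketch of point 5, assuming nginx is fronting phpBB and the WireGuard interface on the server has the (hypothetical) address 10.8.0.1: binding the `listen` directive to that VPN address means the forum simply never answers on any public interface.

```nginx
# /etc/nginx/sites-available/phpbb  (hypothetical path and addresses)
server {
    # Listen ONLY on the WireGuard interface address, so the forum
    # is unreachable from the public internet by construction.
    listen 10.8.0.1:80;
    server_name forum.internal;

    root /var/www/phpbb;
    index index.php;

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php-fpm.sock;
    }
}
```

The same idea works for any self-hosted tool: bind it to the VPN address (or firewall everything but the WireGuard interface) and you get "available on all my devices" without "exposed to the wider net".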
The result - systems that used to be workable have become increasingly cumbersome and impractical. Don't add features, don't try to do more things - just do one thing and do it well.
And documentation!! I know that good doco is difficult (very difficult), but good doco is now completely unknown. Application doco now consists of mindless, detailed descriptions of how to select a specific menu entry or enter data into named fields, with no explanation of the effects of those fields.
System doco is just as bad - almost exclusively automatically generated from function definitions. Any programmer can read the definition, can decode the type, so it's pointless duplicating that - just tell us the subtle details that are not clear from the defs or enumerated types.
If you don't see the value, as the customer, you are probably 100% correct in your assessment - everyone else simply is too afraid to mention "the emperor has no clothes" or isn't aware enough of what the big-picture goals and processes need to be to see that the tool is becoming Epic Fail. In general, most dot com 1.0 and 2.0 business models NEVER had the scaling to support sustained growth for more than a decade or two. MOST SHOULD GO OUT OF BUSINESS JUST ABOUT NOW.
I feel everything is changing too fast and I'm struggling to catch up. I support it and agree with it, it's just so weird to see billboards with NFTs (which I do consider scams) and Ethereum projects spreading so fast.
> And: Trello. It was fine. And then Atlassian bought it and it became this horrific behemoth of "features", all of which just clutter everything up, none of which seems to actually do anything useful.
Everything has to show hyper-growth all the time. A mature product or software package that works is actually bad because it's not showing a rocket ship growth trajectory. So anything mature has to be fiddled with endlessly in an attempt to squeeze more growth out of it.
Or maybe this is hell and we are being punished for sins in our past lives.
I don't want to re-learn how to brush my teeth right when I'm about to go to bed!
Maybe I’m jaded from being in tech too long?
My most effective way to combat this is coming up with creative dev projects to fill the void. A bouquet website, a website about local parks, a website that lets you browse campsites by following the trails like a MUD.
I used to feel guilty for not working on my main business 100%, but I have both saved my users from being annoyed and picked up knowledge I can use to improve my main site when the need arises.
At first you're excited about change because the status quo seems clunky to you. You thrive on change as long as you can, and then one day you start questioning whether certain changes are necessary. And at some point you find yourself rejecting change; holding on to your favorites seems like the sane approach. But what you're holding onto is now clunky and outdated to the new guy, and they are really excited to change things. And so the cycle continues.
The problem is the rate of breakage. Backwards compatibility is a forgotten art. Code is endlessly rewritten and refactored. Sometimes for trivial things like changing names. And of course, they don't keep the old name for more than a year, if at all.
Can't anything ever be good enough? Obviously things move on and new tech replaces old, but we don't need to break compatibility between versions of the same code for no reason.
This doesn't mean innovation should stop, but I think a lot of software doesn't need to change this rapidly to stay relevant. Docker is another example that struggles to find a monetization model because they basically solved the problem with the first version, and added tons of irrelevant features and services around it.
I'm 28 btw, having the same feeling.
Gripe: the Discord app feels too invasive when all I want to do is talk with my friends and play a game together, but it’s what works for us, and with so many people using it, that’s a lot of valuable data to gather in the interest of making the service even better, ideally staying just above the annoyance threshold of each subgroup enough to keep users hooked. The audio’s great, so I just use it in a webapp, thus also avoiding having to open a browser to update Discord on Linux when the auto-updater can’t do it.
Flood of novelty: I still get the little vestiges of a thrill when I update software. If nothing looks different, were there any changes made? Is this part of why we’re seeing so much change, along with trying to please everyone?
This is akin to software-based dashboards in cars with no buttons, dials, or knobs to use, just a touch screen. You potentially have to re-acclimate yourself to how to use something much more often.
It would be good to understand what is driving this, but maybe it's really just an era of change for the sake of change (aka Resume Driven Development, for product managers too, etc.). I often have to rein in developers who just want to rewrite things. It's fun to do a wholesale rewrite, but it's very rarely necessary and brings little actual business benefit.
I wonder if this means there are opportunities for startups to build installable software and/or build products in a way that "locks" a UI/UX for a period of time as a benefit - it would make their product stand out.
On a related note, I'm tired of filling "stretch" roles (more than one role at a time) and being required to work in multiple stacks. Just let me be full stack in only one stack, please.
Back in the day when you shipped a game, you sure as %^*# made sure it worked because you can’t patch it easily or at all.
I think this contributes to it. Nothing is ever complete nor needs to be designed to be complete because the cost of modifying shipped software is lower than ever.
This is hard to project, as there is no definitive target; most of the changes you describe are more operational in nature. Basically, it's an equivalent of planned obsolescence.
Most people can still operate hand screwdrivers and get the job done, yet their preferred tool now may be powered.
There are benefits, of course, and most don't bother noticing these transitions, as they may be occupied with a higher level of problem solving.
Another factor is that in software there's no real "ideal" form. The same material ideas can be shuffled ad infinitum, as in a kaleidoscope, creating different feature combinations and looks. Perhaps you've already seen quite a few and no longer feel excited by the rolls.
But even for younger folks there have been many transitions in a short time, like the fast pace of smartphone changes in less than 10 years. Maybe younger minds are more adaptable to quick changes.
Also, the default for most tech companies is that you become obsolete, so things don't always get changed "for the sake of it" but because if you don't change, before you know it, your competitors all look much fresher than you do and you have 5 years of work to catch up on.
Another issue is direct competition. Your customers are less likely to be loyal because SaaS provides quick and easy onboarding to another solution if the current one doesn't cut it. We have plenty of customers complaining that we don't have feature X and that they will leave if we don't implement it. So we do!
Users, like you and me, are not considered anywhere along the way. As a result, all of our job descriptions now include managing this ever-changing infrastructure of red tape on top of everything else we've got to do. A little bit of this is OK and expected, but it sounds like you're dealing with more than a little bit of it all at once, to the point where it's not sustainable.
If we're in that position, it's up to us to have the confidence to say to our manager, "OK, I can do this, but it's going to take (this much) longer because I have to deal with this other stuff now, too."
Browser vendors (controlled by the same huge companies) blocking websql and poorly supporting things like pwa are other examples of this issue.
Specifically, some people have that capacity and also do not care about whether other people can keep up absorbing it.
Another fundamental reason is that creating change is profitable. Corporations are penalised for making stable environments by competition that creates new things, faster.
The next level from that is corporations focusing on rate of change (that's what Agile is for, folks). Now corporations are competing less on their product and more on how fast they can react to whatever the competition is doing.
You see where all this leads. We users are all just a collateral in the struggle.
I would prefer a more stable operating system over a couple of UI refreshes on Windows or macOS. But that's not how Microsoft or Apple think.
So, with these systems/products and the constant need to improve/upgrade, we are endlessly adding more things to them. There are multiple motivations here: by adding more features they can charge more money, or they are looking to integrate with existing products or cover new markets. In the end, the users are going to suffer the cognitive load.
I don't think it's the age or the person. It's the tools, which are becoming overly feature-rich, or I would say, feature-greedy.
I don't know if it's just me, but I'm seeing a lot of urge to change everything even in more conservative big enterprises, which used to be a safe space of sorts for people who wanted to build products on stable technologies, maybe not so hot but still really solid and well-functioning. I can understand if enterprise desktop applications are seen as obsolete and everything has to be web now (to me it still sucks, though, both as a developer and as a user. I assume I'm obsolete as well), but for a few years now, everything has to be Everything as a Service, and machine learning must be shoehorned in somehow, no matter whether it's well suited to the problem at hand or not. It's gotten so bad that higher management starts demanding rewrites of 100K-1MLOC applications in new languages, with every component of the full stack being new, because trusted technologies are apparently not evolving fast enough.
As I've said above, I assume that I'm just obsolete. But I'm pissed off at the current state of the industry, not only because I have to switch jobs for my mental stability, but also because I used to be at least partially in sync with many aspects of the industry, but that has stopped in the last 5 years or so. I don't see myself being a developer for too long... it's too much bullshit and I have non-software skills that I can put to good use in a job that doesn't make me insane, even if the pay is lower.
Churn is pointless changes. Changes should improve things, not provide job security, offer design novelty, or waste time or money.
There's no direct metric capturing UX stability, and I've never seen a real attempt at measuring what rate of UX change is sustainable. Bundled together with the PM career pattern of "add one big feature that has a measurable impact, then get a new job," we end up with an unstable stew where nothing works like it did last week.
(PS this is the bullcase for a good low-code tool – saas cannot be trusted and is going to exploit us until programming gets simple enough to bundle all this back into a networked graph spreadsheet for precision data MVPs. So we can all just get our work done again. I'm a founder in this space: http://www.hyperfiddle.net)
Maybe products like Teamwork could actually "SELL" the "Feature" of not changing things. Like a promise the UI won't update for 5 years. Or you will always be able to use the "classic" interface, and only progressive enhance your environment as you choose.
I feel like Wordpress does a pretty good job with this. We don't usually have big changes forced on us, and some of our sites are running really old versions because people like it that way.
On the scale of human civilization, software is a newborn. Will we have this level of change in 100 years, or 1,000? Doubtful. We are just waiting for equilibrium to occur, and until then, enjoy the chaos!
Very few companies want to lay off engineers and product managers (understandably so - talent is hard enough to recruit and retain). So these departments just keep producing more and more features and updates.
Truth be told, the software industry is a lot more like the old-school corporate bureaucracies of the past than anyone cares to admit.
IMO now we're going back to feature saturation. Notion, for me, is the perfect example: a great piece of generalist software that doesn't do any one thing particularly well. The new GitHub Issues too, and the new Trello features too. Jeez, even Basecamp nowadays.
For me, the problem is that I don't have the time to learn that I used to. My job has so many other aspects that I no longer have the six-hour blocks to learn new stuff.
- “Can I walk out of this with my content/data and move elsewhere?”
- “Do I really need the data/content when I want to move out?”
And I try to stick closer to the ones where I can move out fast without heartbreak. There used to be a good gesture in the industry for OGs: you were "grandfathered" in and could use the same thing for many, many years to come. These days, they don't even care.
Why did a standard like RSS get expelled from client-server web platforms? It makes little to no sense. The knowledge pyramids that are growing every day are anti-learning, so we need to continue to push for a distributed p2p web.
The growing p2p space is an inspiring counterforce to the dominant venture capital client-server web.
https://gist.github.com/sleepyfox/a4d311ffcdc4fd908ec97d1c24...
G underestimates the cost of change to consumers, has a legit need to add features to their product, and no internal management ability to globally rate limit product changes.
Also, faster dev machines & better tooling make iteration faster.
There's also a cultural shift from shipping mega-features to shipping small incremental changes, which is good for developers (fewer merge conflicts) and QA (incremental, limited testing).
When you start, you learn what is there, and it's all new and exciting. But at 50 you must also unlearn some of what you know, for seemingly little gain. That gets boring.
Do you remember back in the late 1990s, when you were just starting?
In 1998, Cusumano and Yoffie coined the term "Internet Time" to describe Netscape. Here are some quotes from http://edition.cnn.com/books/beginnings/9811/internet.time/ :
> The conventional wisdom about competition in the age of the Internet is that the business world has become incredibly fast and unpredictable, and we need to throw out the old rules of the game. ... After more than a year of intensive investigation, we are inclined to agree with some (but not all) of the hype. ...
> For us, competing on Internet time is about moving rapidly to new products and markets; becoming flexible in strategy, structure, and operations; and exploiting all points of leverage for competitive advantage. The Internet demands that firms identify emerging opportunities quickly and move with great speed to take advantage of them. Equally important, managers must be flexible enough to change direction, change their organization, and change their day-to-day operations. Finally, in an information world where too many competitive advantages can be fleeting and new entrants can easily challenge incumbents, companies must find sources of leverage that can endure, either by locking in customers or exploiting opponents' weaknesses in such a way that they cannot respond. In short, competing on Internet time requires quick movement, flexibility, and leverage vis-a-vis your competitors, an approach to competition that we define later in this chapter as "judo strategy."
Sound familiar?
This of course led to a lot of use of the phrase in pop culture, and to counter-arguments, like Denning's essay at https://dl.acm.org/doi/fullHtml/10.1145/504729.504742?casa_t...
> One of the most common buzzwords today is "Internet time." It describes the apparent increase of the pace of important events that we experience with the Internet. Developments that used to take years, it seems, now happen in days. Competitors pop up by surprise from nowhere; it is no longer possible to identify them all and monitor them. The now-widespread practice of email has simultaneously improved business communications and become a burden for many. Many IT practitioners, growing weary of spending two or three hours a day keeping up with the many dozen arriving email messages, complain of "information overload." Like most buzzwords, "Internet time" and "information overload" contain important seeds of truth while masking misconceptions that lead to ineffective actions.
> Andrew Odlyzko debunks a key aspect of Internet time - the notion that the Internet has sped up the pace of production and adoption of new technologies [3]. He offers example after example of new technologies that have taken just as long to diffuse as their predecessors in previous decades. He concludes that the most cited example, the Web browser, is the single exception to the rule. He claims that belief in the myth comes from a misreading of transient phenomena and from business hype.
Time to rewatch Koyaanisqatsi: Life Out of Balance. :)
Another example is an earthquake. Stresses build up slowly and nothing happens for a long time. Then, all of a sudden, things snap into a new position and the whole tension cycle starts again.
For profit companies and employees have different incentives.
Constantly borrowing from the future is something people got used to.
A secondary driver: the need to reinvent the wheel with each new generation (people want to be proud of their own achievements, not bow to their fathers).
(and don't (want to) think about consequences)
(("We" as humanity. This trend is not only in software; everything down to your pencil is pushed into a new version, for the sake of it. OverTheAir, OutsideYourControl, etc.))
The basics haven’t changed much, and can do most things.
The open source power tools have been rock solid for years and years. I have a setup that is 100% CLI-based and hasn't changed much in 10 years: mutt, offline-imap, notmuch-main, vim with plugins for what you need (wiki, calendar, todo, etc.), git, python, etc. Everything in i3 on Linux.
I think I spent hundreds of hours refining this setup. I am sure somebody will give me the comic strip from xkcd for this BUT: I am happy with my setup, I love tinkering with it and optimizing it. And my results at work reflect this, I never have to search for a file, an email, etc during a presentation or anything like that.
While other people have their Windows desktops covered with folders and then some, and sweat a lot fumbling through their workflows. Meaning they have failed by their own standards. It's the result that matters, remember that. If you're not happy with the outcome, you must be doing something wrong.
Makes me love the command line.
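Part of why a setup like the one above survives a decade of tool updates is that it sits on formats that haven't changed in far longer: Maildir directories, plain-text files, pipes. A minimal sketch of that idea (the paths and the message below are made up for illustration; only standard POSIX tools are used — mail indexers like notmuch just add speed on top of the same greppable files):

```shell
# Create a throwaway Maildir-style layout with one fake message.
# Everything here is illustrative; real mail clients write the files.
mkdir -p /tmp/demo-maildir/cur
cat > /tmp/demo-maildir/cur/msg1 <<'EOF'
From: alice@example.com
Subject: quarterly report
Body text goes here.
EOF

# Plain grep is enough to locate a message by header; the file format
# is stable, so this keeps working regardless of which client wrote it.
grep -l "Subject: quarterly report" /tmp/demo-maildir/cur/*
```

Because the data stays plain text on disk, swapping mutt for another client, or grep for notmuch, changes the front end without breaking the workflow.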
It's not, and it's severely hurting all production environments that need stability and security.
From a narrow viewpoint, perhaps coding is blossoming, its seeds spread to the wind to create flowers across a whole field. Or is it a mirror: where once it was a single plane, we have now shattered it into hundreds of tech pieces and scattered them.
Yet, broadly speaking, things need to change faster if we're to mitigate climate change, disinformation, corruption, and all the threats of 21st century life. I will happily rewrite documentation and relearn methods if they're shown to be safer, more efficient, and more expressive.
Life evolves, and so do we: our habits, our needs. We need to adapt to the environment, to the weather, to scarcity, to people's moods.
You are selfish if you expect things to just stop, and if you don't adapt, nature tells us you'll go extinct.