Let me provide a (frustrating) example: the last straw for me has been OneDrive. I am using it to select and share photos from my wedding. It is an app written by one of the largest and most ancient software companies in history, so they should know something about making apps. And still:
1) The directory list view keeps "losing" my position, so every time I share a photo I have to scroll back down to where I left off (in a directory with 5000 pictures).
2) If I screen-share using the Google Cast functionality, after a few dozen photos it loses the signal and I have to wait a few minutes before reconnecting. The entire app becomes extremely slow in the meantime.
3) The app in general is inconceivably slow. What is taking so long? I am viewing the same directory for 2 hours, why is it still so slow to load?
So at this point I am struggling to understand: how come such an app got released? Are the incentives given to developers so at odds with app quality?
Additionally, we as developers keep building software using more and more complicated tools that seem fancy and new to us, but are brittle and don't deliver good software in the end. We keep adding more and more layers of abstraction, both on the frontend and backend. Why? To put it on our CV. Things are moving so fast that we're afraid to get left behind. We're at a point where things just keep getting more and more complicated – actually keeping something alive (let alone building new features or making those features work) takes more and more man hours.
Sage Payroll across a network - even on a 100Mbps switched setup - was laughably slow because it would do I/O a few bytes at a time.
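The cost of byte-at-a-time I/O is easy to demonstrate. Here is a hypothetical sketch (the `CountingWriter` stand-in is invented for illustration, not anything from Sage): every tiny write is a separate call, and over a network share each call can be a full round trip.

```python
import io

class CountingWriter(io.BytesIO):
    """In-memory file that counts write() calls, standing in for a network
    share where every call is a round trip. (Hypothetical stand-in.)"""
    def __init__(self):
        super().__init__()
        self.calls = 0

    def write(self, b):
        self.calls += 1
        return super().write(b)

def copy_unbuffered(src: bytes, dst) -> None:
    # One byte per write: a syscall (or SMB round trip) for every single byte.
    for i in range(len(src)):
        dst.write(src[i:i + 1])

def copy_buffered(src: bytes, dst, chunk: int = 64 * 1024) -> None:
    # Large chunks amortize the per-call overhead.
    for i in range(0, len(src), chunk):
        dst.write(src[i:i + chunk])

data = bytes(1_000_000)  # a 1 MB file
u, b = CountingWriter(), CountingWriter()
copy_unbuffered(data, u)
copy_buffered(data, b)
print(u.calls, "vs", b.calls)  # 1000000 vs 16
```

At even a fraction of a millisecond of latency per call, the unbuffered version turns a sub-second copy into minutes.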
Symantec's antivirus would refuse to update, even on freshly-installed servers. Norton had official instructions on troubleshooting LiveUpdate which included having to unplug the modem to fool it into thinking you were offline.
Windows would hang, crash, run like treacle, or die if you plugged in a printer or inserted a blank CD-R that hadn't been formatted properly.
When iTunes first came out? Oh god, one early version broke the Windows Installer somehow, killing all MSIExec-based software in the same way.
IME, software is just as bad as it always has been. Microsoft still ignore their own best practices, design guidelines and even user consent recommendations. Electron may be new, but it's as efficient as software has always been: insultingly, laughably so. Installing printer drivers seems easier until you realise, three months later, that it's pointing to an IP address rather than a host name, and the printer has been given a new one. Or you're forced to set up an online account just to scan, or even to log in to your computer.
I don't even have to work on enterprise software which is why I do not have grey hairs :-)
https://eh.net/encyclopedia/history-of-workplace-safety-in-t...
The sharp rise in accident costs that resulted from compensation laws and
tighter employers’ liability initiated the modern concern with work safety
and initiated the long-term decline in work accidents and injuries.
There was a long history of companies not caring about accidents and deaths caused by industrial machinery. Companies claimed it was unavoidable, blamed their workers for doing things wrong, etc... Then the law made them care. The unsolvable non-problem turned out to be a solved problem within a few years. We also got some red tape as a side dish, but that's a good deal, all in all.
Software is the same: if organizational leadership felt immediate pain from sloppiness, that sloppiness would disappear quickly. But we have now reached a point where software comes with no warranty at all, users have accepted low quality as a regrettable fact of life, and companies attribute their low productivity to vague, nebulous, unsolvable mysteries instead of to sloppy software. So here we are.
Modern software is in fact more reliable than older software.
We got into a situation where people are so disempowered by design, have such poor access to routes to complain, and are so demotivated to complain, that poor quality has become the norm.
If you want this to stop we have to collectively rip a new asshole in every half baked pile of muck out there loudly. Really loudly. Start burning up people’s ROI. Buy an app and it’s shit? Get a refund and then tell everyone everywhere exactly how bad it is.
When I do this I am repetitively told that I’m negative and this is not a productive attitude but I disagree and see this as a defence of the improper norm. The first step of quality is acknowledging you have a problem which needs to be shouted in most companies faces until they can hear through the fingers in their ears.
However, even worse is that the complexity of software has grown to absurd levels:
- Massive distributed microservice architectures for simple chat apps and news websites.
- Huge distributed teams that need to coordinate rapidly in an Agile environment (which was never meant for teams that size, which is why you get crap like SAFe).
- Team churn, because pay is lousy unless you change jobs every 2 years.
- Bad practices that are still taught as best practices to this day in university, e.g. OO-inheritance hierarchies.
- Terrible foundations: how do you make a native Windows UI again? The web grows at a rate that's impossible to keep up with, but even desktop apps build on it these days because the OS APIs are so lousy (even Apple is starting to crumble there).
- Sheer size of software with shorter time to market, leading to lots of open source code reuse, which is a big pile of code that can contain a big pile of bugs you probably can't do much about. I work on a piece of software that has thousands of dependencies if you count them transitively.
It's all just... too much, it's too hard to keep up and shortcuts are constantly taken (tech debt yada yada). There's good software out there but it tends to be small, focused, slow to evolve and developed by a small team.
Quality is a cost, and users don’t generally pay for the marginal value of a less buggy app, they pay for the massive value of the categorical problem being solved.
You sat there for two hours viewing a single directory, clearly making the page faster doesn’t mean you’ll use their service more, so why should they make it snappier?
* Fixing bugs isn't sexy. Constantly pushing new features is. Tech debt therefore isn't managed properly. I am leaving a job right now because of this.
* Problems with performance arise because developers frequently only test the application on their own machine and/or phone, and developers typically have nice hardware that runs the application quickly.
* Mobile applications only get tested in areas where there is a good signal and issues that occur when signal strength and/or bandwidth are poor aren't accounted for.
* Decisions on what gets prioritised are based entirely on user metrics collected in-app. Which means anyone who blocks this reporting, or doesn't fit neatly into the most common scenarios, will be left out in the cold, with their issues de-prioritised and possibly never fixed.
* Tech debt at these companies is insane. Your app probably has a large client-side and server-side solution that takes months for someone to become proficient in, even for making the most basic changes. There will be 500-line switch-statement blocks, classes that are thousands of lines long, etc. This is due to tech debt not being managed and people just hacking to get stuff done by sprint deadlines.
* There will be 1000s of long standing bugs that can only be reproduced on particular devices (which may or may not affect you). Frequently devs might not even be able to get their hands on the device to actually fix said issue.
* I would wager that in some cases the info-sec team will have ridiculous hang-ups about plugging a phone into a PC / Mac via USB to debug. Frequently issues will be debugged via archaic or jury-rigged solutions which would seem ridiculous to most freelance / non-corp devs.
Please watch his [Preventing the Collapse of Civilization (1 hr)] presentation, which goes into detail on why what you observe is happening.
https://www.youtube.com/watch?v=ZSRHeXYDLko
Edit: Corrected the uppercase spelling of person's name.
The simple answer is that a lot of software is simply good enough. It doesn't have to be perfect under all circumstances. And making it much better than good enough has diminishing returns, because it requires non-trivial investments in quality.
In this case, you pushed an app out of its comfort zone and ran into some issues. Simple suggestion: use something more suitable for what you are trying to do, and accept that this maybe isn't what this specific thing was designed to do, or to do well.
Software is built without thinking about whether it will compose, or how it will compose.
The elements that make things compose go by the strange word compositionality: the "laws" that, if followed, make things composable; the properties that things can have to make them composable.
An API is more composable than a stateful mishmash of library functions. Now let's imagine that underneath the API there is a core data structure that has its properties and relationships clearly and usefully defined. Can we automatically generate the API code for it? We should be able to, at least to a significant degree. Then that's more composable. More compositional.
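As a rough illustration of that idea (the names and structure here are invented for the example, not taken from any particular framework), here is a sketch that derives a minimal CRUD API mechanically from a dataclass, so the data structure is defined once and the API falls out of it:

```python
from dataclasses import dataclass, fields

@dataclass
class Photo:
    id: int
    name: str
    album: str

def make_crud(cls):
    """Derive a minimal in-memory CRUD API from any dataclass: the structure
    is declared once and the API code is generated from it."""
    store = {}

    def create(**kwargs):
        obj = cls(**kwargs)
        store[obj.id] = obj
        return obj

    def read(obj_id):
        return store.get(obj_id)

    def update(obj_id, **changes):
        obj = store[obj_id]
        valid = {f.name for f in fields(cls)}  # fields come from the declaration
        for key, value in changes.items():
            if key not in valid:
                raise AttributeError(key)
            setattr(obj, key, value)
        return obj

    def delete(obj_id):
        return store.pop(obj_id, None)

    return create, read, update, delete

create, read, update, delete = make_crud(Photo)
create(id=1, name="beach.jpg", album="wedding")
print(read(1).name)  # beach.jpg
```

The same `make_crud` works unchanged for any other dataclass, which is the compositional point: clearly defined properties let the surrounding code be generated rather than hand-written.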
I think this is similar to video games: people see a pixelated edge in a game and think the whole thing is crap, even though it rarely happens.
Now if you remove "modern" from the question and just ask how you explain the sloppiness of software, then all the same answers come back up. Today, when software development feels about 90% gluing stuff together, it's because there is so much stuff out there that we can't even be aware of it all anymore.
It used to be that a person could be well-read on nearly everything (100+ years ago); then you could do so within your field and be aware of everything else; then, for most of the 20th century, you couldn't know everything in your field, but you could mostly know it and be aware of everything else. Sometime about 10 years ago we hit an inflection point, where you can no longer even be aware of everything in your specialty. If you are a database "specialist", you can't even know of all the databases that exist, let alone understand them.
So weird, and so impactful to us and our society.
Anyway, old man rant over.
There isn't.
Building products is hard. Building complex products, like software tends to be, is incredibly hard.
If you ever want to read about how terrible pretty much everything is when it comes to design, read "The Design of Everyday Things". It's a classic for a reason. It shows how so many products that people rely on every day are just terribly designed, even something as simple as a door or a remote control.
Software isn't uniquely bad. It's just regular old bad. Incentives are, like in every field, to make money. Up to a certain level quality is hugely important, and after that level it isn't important (for making money).
I mean, the question isn't actually would you want this or that bug fixed or not. It's would you give up other pieces of software that are important to you, so that some bug that most likely doesn't affect you is fixed. Because that's the tradeoff. More quality = less products, because to a large extent it's the same workforce.
And btw, modern software is way better than most older software. Way more features, and largely more stable.
Witness the distressingly high number of devs that fizzbuzz catches out.
I don't know if they are in the wrong industry / career, are just generally incompetent, or could be good and just don't care, but it seems like the percentage of incompetents is rising.
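For reference, FizzBuzz is the trivial screening exercise in question; a minimal solution looks like this:

```python
def fizzbuzz(n: int) -> str:
    # Multiples of both 3 and 5 must be checked before the single divisors.
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

print(" ".join(fizzbuzz(i) for i in range(1, 16)))
# 1 2 Fizz 4 Buzz Fizz 7 8 Fizz Buzz 11 Fizz 13 14 FizzBuzz
```

That a non-trivial fraction of applicants can't produce something of this shape is the point the comment is making.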
I think tradition and collective experience matter very little with the issues you're describing. It is hard to form and keep a team that is both technically excellent and will make sensible UI choices.
Additionally, I don't know if this is true, but I get the feeling that pulling "product" and "UX" into totally separate professions has meant that ownership of the overall quality is now theoretically in the hands of people who can't ensure it. A similar idea to the principal-agent problem, although in reverse.
Institutionalized quality would include:
* Uniform minimum accepted standards of practice.
* Uniform foundations of minimum accepted product quality.
* Uniform accepted measures and metrics.
* Formal training dedicated to standards and software as a platform, not tools.
* Credentials to certify standards of practice and conformance to ethics therein.
Instead children are left to drive the bus and cry about how hard life is. Anytime this subject comes up there appears to be no adults in the room.
I'd say the top 2 reasons are tied to either adversity in software engineering or simple economics. Devs are pressured to work fast, overtime, in understaffed teams, prioritizing features over quality. Once a project becomes old enough, changes suddenly become much more expensive and issues start getting lost at the bottom of the backlog. Large projects are chock-full of tech debt, bad design decisions that used to be good at the time of writing, undocumented features, and untested edge cases. IT workers constantly changing jobs means knowledge slowly evaporates. And let's not forget how fast tech becomes legacy, and all the associated difficulties of working with old tech.
The economics part is simple. Businesses prioritize profit. If people can live with a few quirks, they'll stay indefinitely. If a company gets X cash for a project, they'll maximize the profit by doing as little as possible while getting it done. If a product isn't really making much profit, it goes into maintenance mode, sacrificing quality for a shiny new thing that will hopefully be more profitable. Once projects go into maintenance mode, they only fix what's being paid for, and that's the bare necessities.
And this is how we end up with the world's most successful desktop OS having title bars that can no longer be used to drag the window, menu bars consisting of ALLCAPS to hint that it might possibly be a menu, a start menu that you can no longer browse, where your installed programs never show up or disappear, and where central OS windows no longer follow basic UI guidelines (goodbye incremental keyboard search and the ability to sort and resize list columns; hello scrolling lists without scrollbars; and goodbye any indicator of how big this scrolling list is, or whether there are further items below to scroll into view). It is telling that Microsoft has not been able to produce a viable GUI framework/toolkit since 2006. Interestingly, WPF still sports 'the worst folder picker in the world', even though Forms has offered an excellent and superior folder picker for around 20-22 years by now.
I'm on a MacMini from 2018 (I think) with 16GB of memory, on the latest version of Chrome, still can't believe Google "bug-regressed" on something as basic as online chatting through a browser. I was able to chat in a Netscape 4 instance on a 5x86 (or similar) more than 20 years ago with no such inconveniences.
Would you be willing to pay more - or pay at all - for a better solution? Or write own software?
People overall are accepting tragically user-hostile software. How many stayed on Windows after it got ads in the start menu?
-----------------
In the case of OneDrive, the following things can happen:
- people escape to superior competing solutions
- someone spots that no one provides a high-quality solution, makes their own, and earns money on it
- people accept the low quality
- OneDrive gets improved
(multiple of these can happen for multiple groups of people)
-----------------
How much are you willing to pay for cloud storage?
How much are you willing to pay for a high-quality search engine without ads? (I would say that 20-50 euro per month may be viable for me if search quality is noticeably better than Google's. I am aware of Kagi, but not sure whether it actually gives better results.)
How much are you willing to pay for open-data navigation that doesn't track you? (In my case, I put significant effort into OpenStreetMap.)
Note that for many people the answer to all of the above is "nothing at all", so services provided to these people care about stuffing in as many ads as possible rather than about quality.
-----------------
Also, software in many respects is strictly superior to what was available in the past.
Most of the software I use is great. Absolutely, mindblowingly great. This is in the face of seriously hard challenges with physics and mathematics. I get that you're having some weird troubles with Google Drive, that I've personally never had, but when I sit back and think about just how much software I use I'm astonished that I'm not frustrated more often. I send messages, they get delivered. I take a photo, it has crazy high quality for the size of the lens and sensor. I google a python question, I get an answer. I look for music, it's there ready for me to listen to it.
You have high standards, I get that, but software is hard. It moves so fast and even seemingly simple things are way harder than first glance. I don't know why the edge case you hit was so bad, but in general the GSuite has been great for me. It's always possible it was something weird like a browser extension.
Anyway, what I'm trying to say is that so much of software works well that you may have stopped appreciating it. I try not to forget what computers were like twenty years ago. It was madness.
Additionally, Moore's law and the well-paid jobs of developers mean that the machines where the code is written are not the machines where it is run, with the difference often being a few generations of processors, graphics cards, and monitors.
No amount of discipline and techno trickery will get you to a happy user if you lack the capacity to feel what they may be feeling while using your software.
In most large organizations the breakdown is obvious. Too many layers. Opportunity to engage empathically is lost to some twitter support bots. Everyone on the inside is reduced to seeing some polished interior of a mindless corporate automaton.
Steve Jobs is an excellent example of how to cut through this problem in a large org. He reinjected the actual feelings of the end user from the top. Few other corporate leaders do this today.
In modern software, integration wins. Look at MS Teams. Complete shite. A buggy, bloated mess. But it wins because it integrates many different services together. Be honest, every time you have done any kind of integration between two pieces of software there have been hacks. But appealing to standards never works. Instead you are just told to make it work. Those who make it work are rewarded. Those who appeal to standards are sacked. You've basically hacked your train wheels to make them work on dodgy track and then wonder why the train derailed.
It's not all bad. From the TCP layer down we have all agreed to a few standards that work. It's all the stuff above it that breaks.
My other gripe is how, with increasing screen resolutions, UIs are eating more space totally unnecessarily. That Word print dialogue box is a good example: it must use a third of a 4K screen's estate only to display a dialogue with a button that is confusing, as it isn't consistent with the overall Windows UI.
TDD/BDD proposes a world where all things work and remain working when investigated by an automated test. The automated test is built up step by step, happily doing everything from start to finish to look at one feature; that feature may remain, and keep receiving repairs, long after it has stopped making sense in the flow, with very little observation of how broken it might be in natural use.
Major software companies rediscovered that users will tolerate crap, so they've adopted processes and mindsets that serve the company at the expense of the user.
Also, in many ways the web sucks compared to native software, and the modern impulse is to web-all-the-things. So what was once a relatively slim desktop app is now some bloated website in a can.
The mean is now some shitty website, or a shitty website in a can. And no matter how good a software company starts out, it's run by businessmen, and most of them will push it back towards the mean ("Why are we developing this fast, slim native app? We can reduce duplication and cost by shipping an Electron app that just repackages our web version.").
Management is by now mostly "OKR: Objective, Key Results" driven.
But the performance review mostly focuses on the KRs, as they are designed for measurability; the O is an afterthought.
It should be the other way round. It isn't. By now, when a company works with OKRs but the product quality that comes out sucks over multiple time periods, I quite often recommend trashing the KRs completely and focusing only on the O.
The complexity of the interactions you described is orders of magnitude beyond anything software in, say, the 80s had to deal with. Just imagine how many technologies you are using to display photos from "somewhere in the cloud" on your TV.
Now we can discuss how much is essential and accidental complexity, what alternatives are there, and whether the tradeoffs are worth it.
Compared to the Windows Millennium of yore with its blue screen every now and then, Visual which would take forever to load, and Firefox which would crash miserably every few hours.
I don't deny your experience but this is just anecdata. The whole tendency of software is to get more reliable over time. Even if the complexity can set reliability back sometimes.
The tools and frameworks add bloat. On some level we do this because the complexity of what we're dealing with gets in the way of the problem we're trying to solve. We add a layer of abstraction to hide the details, and the solution becomes tractable.
We were supposed to be able to have our cake and eat it too, but it turns out that a lot of these abstractions aren't free in terms of performance.
The other factor is that a lot of developer tooling is designed for developer convenience. This is nice when you want to try out an idea. But the prototype often becomes the product. It’s nicer to work with, perhaps easier to add features to it, but that trade off is still there: performance.
And we’re taught to feel guilty or told we’re doing it wrong if we show any concern about performance. I’ve seen developers accused of the dreaded, “premature optimization,” sin for making suggestions in PRs to avoid code that would introduce poor performance. Or for suggesting anything performance related.
Lastly, a lot of developers get to work on the latest and greatest hardware. They probably don't spend any time or effort testing on older or low-tier hardware. This leads to designs that are "good enough" on those machines but will be slow as molasses on anything an average consumer would use. There's a highly myopic view about platforms, and the issue is often not even considered.
That is, move fast and break things (cause it works for them) vs let's get this right (else we'll be targeted on HN).
In the case of Google, they don't really have an incentive to do better product. The products are simply a smokescreen for their search / advertising monopoly.
Furthermore, you and your One Drive, Cast, etc. are more of a source of data to be harvested than a means to bringing joy and satisfaction into your life.
I face the same thing. I'm locked into using a particular GIS software package for my work, supposedly a central system accessible from any device. However, in practice they have 3 different apps and 2 different web interfaces, none of which have the same feature set. And often I want to use data that's accessible on one of their platforms to do processing that's only available on another of their platforms. And there are backend problems for which I've periodically been sending in support/suggestion tickets for a decade without any fixes. However, their marketing feeds are constantly bragging about new features and integrations. Why? Fixing their broken shit is harder work for less payoff. So they just don't. And they've got the patents, so they don't have to care.
Why bother caring?
In free software, developers are users, so they care. The software is free, but developer-as-user time is not, so it is better to spend 1 hour once to save 1 minute every time in the future.
In commercial software, developers are not users, so they don't care. The software is paid for, so ROI is important; thus it is better not to spend 100 hours of developer time plus 1 hour of upkeep time every month to save 1 minute of user time, unless it's important to crush a competitor.
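The break-even arithmetic behind that argument can be made explicit (the numbers are the illustrative figures above, not measurements):

```python
def breakeven_uses(dev_hours: float, minutes_saved_per_use: float) -> float:
    """Number of uses before a one-off fix repays its development time."""
    return dev_hours * 60 / minutes_saved_per_use

# Free software: the developer is also the user, so saved time accrues to them.
print(breakeven_uses(1, 1))    # 60.0 -- repaid after 60 uses

# Commercial software: 100 dev hours to save 1 user-minute per use. The saved
# time lands on the user's side of the ledger, so the vendor never sees the ROI.
print(breakeven_uses(100, 1))  # 6000.0
```

The fix pays for itself in both cases eventually; the difference is who collects the payoff.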
Commercial producers quadruple prices until users stop buying, drop quality until users stop buying, stuff in ads until users stop buying, shrinkflate until users stop buying, and so on.
Did you switch to a competitor's product? No? Then why should the company care? Vote with your money.
Monopolies: Microsoft, Google, and Apple make stuff that has no real competition, so the manufacturers have no incentive to polish their products.
Linux: products made by volunteers who work when they can, as much as they can, with the attitude "you get what you paid for, so don't complain too much". That is totally fair, but it leads to sloppy software.
Race to the bottom: like our MES supplier that fired most of their US-based people and hired juniors in India; the quality went downhill, but the lack of better alternatives (again... monopolies) makes it work for them, financially, for a while. Their CEO gets the bonus for this year's savings, not for the death of the company 5 years later.
EDIT: adding the Internet, i.e. the way you can ship shitty software that you may partially fix later.
This is what ended up pushing me into the open source / free software world. As long as I need to deal with shitty software, I'd rather deal with the annoyance of not being cared about, than with the dark pattern galore of the proprietary world.
If you're the one guy who takes twice as long to write code, even if it works really well and needs little maintenance, you'll be the one who doesn't get bonus / gets laid off.
It's really a hard sell. I feel uncomfortable, and I even try to do more testing by myself, but I feel pressure to deliver more features and scrum points rather than polish some corner cases.
People do point out the Pareto distribution to me (capture 80% of the ROI; forget the 20% that has less value and costs a lot), but it's stupid when your service scales to high thousands or millions of users, where the chances of edge cases popping up are much higher.
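A quick back-of-the-envelope calculation (illustrative numbers only) shows why the "forget the 20%" advice breaks down at scale:

```python
def affected_users(users: int, edge_case_rate: float) -> int:
    """Expected number of users hitting an edge case at a given rate."""
    return round(users * edge_case_rate)

# The same "rare" 1% corner case at three different scales:
for users in (1_000, 100_000, 5_000_000):
    print(f"{users:>9} users -> {affected_users(users, 0.01):>6} hit the edge case")
```

At a thousand users, a 1% edge case is ten annoyed people; at five million, it is fifty thousand.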
Is mass-market fast food quality food? Are mass-produced electronic gadgets and toys of good quality? And so on and so forth. The most money is to be made in bringing mass quantities of shiny crap to the market.
Educated clients, or the lack thereof. As opposed to the past, when most software was for professional use, these days the software that brings in the most bucks is for mass consumption. And it isn't very long-lived either; people get bored fast.
As the economy changes consuming habits will change too, I think.
- The industry disincentivises people from staying long at any one job, encouraging loss of institutional knowledge and discouraging fixes for issues that might require attention over a long time period (years).
- Sales-led growth is a double-edged sword: you sell a piece of software before the features exist, which means you might face an impossible timeframe to complete those features without cutting corners.
1. Parallel programming is hard: Everybody programs for multiple cores now, but this makes the program architecture much more complicated and dealing with errors much harder. You have all kinds of threads or green threads doing things in the background, each of which may fail in various ways, and you have to deal with this asynchronous command execution somehow. The indeterminacy of such processes is the cause of unexpected behavior that makes an application look unprofessional. As a drastic example: you press a button, it starts doing something, and then shows a spinning wheel of death until an automated timeout is hit after 2 minutes.
2. GUI frameworks are much worse today than twenty years ago: For example, web application frameworks consist of layers of different programming languages and document formats (JS, HTML, CSS, frameworks on top of that, and maybe another programming language connecting all this mess) running in a web browser (or some kind of half-baked emulation of a web browser!). It's a wonder these work at all. Desktop and mobile frameworks not based on web technology are often thin layers on top of SDL nowadays. This means they reinvent everything, and it's very, very hard to do this correctly. Even native OS controls/widgets often have problems, and those have been fine-tuned by Apple and Microsoft for decades. Your multiline edit field behaves weirdly? That's the reason why. Write your own multiline rich text editor with image support and you'll see why this is hard.
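The spinning-wheel failure mode from point 1 can be sketched with asyncio. Everything here is hypothetical (`fetch_thumbnail` is an invented stand-in for a background task that hangs instead of failing fast), and the timeout is half a second rather than 2 minutes so the sketch runs quickly:

```python
import asyncio

async def fetch_thumbnail() -> str:
    # Stands in for a background task that silently hangs (lost signal,
    # dead connection) instead of raising an error promptly.
    await asyncio.sleep(3600)
    return "thumbnail"

async def on_button_press() -> str:
    # The UI only discovers the hang via a coarse timeout; until then the
    # user stares at a spinner.
    try:
        return await asyncio.wait_for(fetch_thumbnail(), timeout=0.5)
    except asyncio.TimeoutError:
        return "error: request timed out"

print(asyncio.run(on_button_press()))  # error: request timed out
```

Note how the failure is only observable indirectly: the task never reports anything wrong; the caller just gives up waiting. That indirection is exactly the indeterminacy described above.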
In terms of QA, it seems that once native user interfaces are given up, an "anything goes" mentality becomes prevalent. Maybe it's because companies think they compete with web apps (=even worse user interface) rather than native apps.
Tl;dr Programmers have been piling crap on top of crap for two decades now, and if you combine that with parallel programming and connections between multiple layers, it's going to be fragile and error-prone.
Yes.
To expand: as others have said, QA is a cost centre, and QA often reported things that it wasn't clear users would actually view as problems.
By cutting that out you get two theoretical benefits:
1. The people actually using your software will be the ones telling you where the problems are. This helps you prioritize work.
2. You don't spend as much money on QA people or dev time on fixes, which means more on feature work.
Maybe a not so obvious side effect of agile development is that it sort of reorganized software development into feature-focused development, meaning developers' bonuses at a lot of companies are often tied to the amount of new features they implement, and there's almost no focus on maintenance or fixes.
The same thing happens in politics: The incentive is on shiny new things, rather than maintaining the old things everyone uses constantly and desperately need the repairs. It isn't until bridges start literally falling down that tunes change on this.
Basically, if your performance is measured by how many new features you're able to deliver on time, to timelines set by a manager or executive who also has a bonus tied to how many new features or products their teams were able to deliver in a quarter, you're not going to think about QA at all if you can help it.
- Profitability prioritized over user experience
- Design prioritized over functionality
- Never-ending push towards new (mostly useless) features
- Poor product specification for new features
- Insufficient involvement from customers/users in the Product life cycle
- Technical team not involved in Product decisions
- Lack of investment to reduce overall technical debt
- Disregard to system architecture and coding conventions
- Deficient onboarding/training of new developers
- Desire to adopt new tools/frameworks instead of proven/established ones
I remember installing stuff 10-15 years ago: it would constantly fail with weird errors and bugs. Using Linux was a real pain. Everything was much more monopolistic than it is today: you had to use x for that, you had to use y for this. While we still have a lot of issues today, it is vastly better than it used to be.
I haven't had any machine crash with a BSOD on Windows for years, and on Linux everything just kind of works and I have fewer and fewer issues. I can work remotely 100% of the time and do impressive things in the browser, stuff that was simply not possible 10-15 years ago.
I remember what a big deal Gmail was when it came out and today we take that kind of service for granted. Even with all the warts modern software has, it is still better than it used to be.
As we burned all hydrocarbons we eventually discovered the personal computer and the internet AFTER we decoupled our economy backing from hard assets.
Now that energy, which is just stored sunlight, is running out and so everything else will slow down and have to be optimized.
We are at peak EVERYTHING, both good and bad!
Personally I'm staying on HTTP/1.1, SMTP and DNS, OpenGL (ES) 3, SPI, JavaSE on server, C+ on client (C syntax with C++ compiler), vanilla HTML, .css, .js for GUIs for life.
Windows 7 is still the best Windows. Linux is still the worst desktop.
Hardware is NOT getting better, I'm staying on socket 1151 until they become too expensive to power!
Intel Atom as load balancer, Raspberry 2/4 as file/compute servers, Raspberry 4/Jetson Nano as desktop and Raspberry Pico as mobile communication is the only viable future.
Let's get busy and build a low energy future!
For instance, see Lotus Notes.
There does seem to be an increased tolerance for UI latency in _consumer_ software over the last decade or so, and I'm not totally sure why. Pretty much any Electron app is laggier than pretty much any consumer app from the early 2000s, and people seem to be broadly okay with that.
Also, Microsoft may at this point view OneDrive as basically an enterprise thing, and you can get away with practically anything in enterprise software. Without seeing statistics, I would guess that OneDrive is not commonly used for consumer purposes vs the competition; it definitely _feels_ like an also-ran. Microsoft's enterprise offerings have always been pretty awful.
At a guess, that's probably because nobody ever tested the app with more than a dozen pictures in a folder.
I think people should create a ticket and ask for support. If nobody reports it, they don't know there is demand for a particular feature or fix.
I saw a page talking about Microsoft updating Teams to make it less bloated [1]. I think they give higher priority to popular products.
[1]: https://tomtalks.blog/microsoft-teams-2-0-will-use-half-the-...
But this is not a modern problem. I saw this from when I started working in software 25 years ago.
Pick any or all of them as fractional contributors.
Money. Every dollar of QA is potentially a dollar wasted on making something better than what customers can tolerate.
It seems to me that if somebody said "let's make a program that can easily view 10,000-picture albums on a high-end computer", it could be done. You would have to think through the data structures and apply the methods used in high-end video games.
It seems to me nobody is taking the problem seriously enough.
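The "video game" approach the comment gestures at is essentially list virtualization: only the thumbnails in or near the viewport are ever decoded and kept in memory, so a 10,000-photo folder costs no more to scroll than a 50-photo one. A minimal sketch in Python follows; all names, window sizes, and the placeholder `load_thumbnail` are illustrative, not any real framework's API.

```python
# Sketch of list virtualization for a large photo album.
# Only the visible window of thumbnails (plus a small buffer)
# is materialized; the other ~10,000 photos stay untouched.

from functools import lru_cache

TOTAL_PHOTOS = 10_000
ROWS_VISIBLE = 6        # rows that fit on screen
COLS = 5                # thumbnails per row
BUFFER_ROWS = 2         # extra rows pre-loaded above/below the viewport

@lru_cache(maxsize=512)
def load_thumbnail(index: int) -> str:
    # Stand-in for decoding a cached thumbnail from disk;
    # the LRU cache keeps recently scrolled-past thumbnails warm.
    return f"thumb-{index}"

def visible_indices(first_visible_row: int) -> range:
    """Indices of the photos that should be materialized right now."""
    start_row = max(0, first_visible_row - BUFFER_ROWS)
    end_row = min(TOTAL_PHOTOS // COLS,
                  first_visible_row + ROWS_VISIBLE + BUFFER_ROWS)
    return range(start_row * COLS, min(TOTAL_PHOTOS, end_row * COLS))

# Scrolling to row 100 touches only ~50 thumbnails, not 10,000.
window = [load_thumbnail(i) for i in visible_indices(100)]
print(len(window))  # 50
```

The same idea is why a game can render a world with millions of objects: cost is proportional to what is on screen, not to what exists.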
E.g. when you are using an app tied to a service you are already sold on, you cannot select an alternative app to use (because e.g. your fitness-chain only has their own app). Similarly, if you are a gmail user, most users are not informed enough to figure out they could access their mail with an alternative IMAP client.
Good software happens when you can choose between competing alternatives. But the world is currently filling up with siloed monopolies that don't have to compete.
I see something similar with Christmas calendar candles. Every December, supermarkets stock Christmas candles of horrible quality. They get away with it because every shopper only needs to buy a single calendar candle, and every shopper is an inexperienced Christmas-candle buyer. Meanwhile, for the rest of the year, you can buy big candles of the same size and of excellent quality, just without the dates of December marked on them. TL;DR: a lot of inexperienced candle buyers exist only in December.
It was not tested with folders that large.
too few developers enjoy their job
managers are incompetent but multiply like flies
Did you ever use Windows 95? It seems like the quality then and now is about the same.
It's all so vague; it's all too easy to see one's favourite enemy as the culprit.
growing faster than competition
distribution networks
growth hypothesis > value hypothesis
Sufficiently Strong Optimization Destroys All Value
QA: For the most part, QA always wants to do the right things and push for them, but they have so little say that it matters very little to none. And having worked at FAANG/M, QA hires are often wives, friends, or people with no industry experience, brought in as favors by engineers who couldn't spell QA if their lives depended on it.
ENG: They do what they are told and agree to an absolute minimum, i.e. they don't even bother with boundaries, let alone corner cases, and skip basic smoke/regression testing before checking in. They'd rather break the build and spend 2-3 days on a revert than spend those 2-3 days being thorough, complete, and correct.
TPM/Prj/Pgm Mgrs: Meeting their own badly projected schedules and/or trying to meet unrealistic schedules driven by factors outside their control.
Prod Mgr: Badly researched, designed, and defined products. A good product doesn't need to be explained; it should be designed to just work. Thanks, Steve Jobs, for pushing that mantra. It's like products for babies designed by people who have no concept of a child, or autonomous driving designed by a person who doesn't own or drive a car.
A clear example of what happened to me today: I have a Googl home thingy in every room. As the father of a toddler, having voice control is at times indispensable. Today I tried to get a mini to stream music, but it tried to stream a video stream and complained that I couldn't on that device, yet it then proceeded to stream the audio portion anyway. Before I noticed that it was, I clarified my request by saying to stream music. So it did... BUT it was now streaming both the audio portion of the video it said it couldn't play AND the new content. I tried everything under the sun to stop one or the other or all of it, on every device... NOTHING. It kept playing. I was bathing my kid and had to dry my hands and get to a phone to manually pull up the home app and kill the streams. OMG. Now my toddler screams one thing and only one thing, because that's all that is screamed in our house: "Hey Googl... STOP!"
There is no question that the idiot's mantra "move fast and break things" is literally being executed everywhere.
It's the same love-hate with Tesla: for every update, there are 10 things they break or step backwards on.
I have no solution; it would take a mindset shift at next-gen companies to give their users quality along with quantity. I loved gadgets, tech, and all things shiny, but the disappointment is almost unbearable.
PS> Now that we literally have an army of geriatrics with time and money, someone should create a "Grandparents tested and approved" certification. If they can do what they need to without intervention, you've got gold.