So far as I can tell the problem started around the time Win95 came out, and it affected MacOS Classic at around the same time. It did not seem to affect minicomputers like the PDP-8, PDP-11 and VAX. It did not seem to affect home computers like the Apple ][, Commodore 64, and such. I don't think it affected PC compatible computers running DOS. I don't think it affected Sun or AIX workstations or Linux machines running X Windows until KDE and GNOME came along; now it does.
(I think it happened around the time that most GUI systems started to incorporate distributed object models and RPC mechanisms like COM, XPC services, DBus, Binder, etc. Certainly all the systems that annoy me like this have something like that.)
Reinstalling the OS seems to help for a short time, but as time goes by the honeymoon seems to get shorter and the machine gets slow again faster.
Back in the day it helped to defragment hard drives but we're told we don't need to do this in the SSD era.
If I had to describe it, it is not that the machine gets "slower", but instead it has periods of being unresponsive that become longer and more frequent. If it was a person or an animal I'd imagine that it was inattentive, distracted, or paying attention to something else.
If you benchmarked in the ordinary way you might not see the problem, because it's a disease of interactive performance: a benchmark might report 10 seconds on either an old or a new machine, even though the benchmark takes 0.2 sec to launch on the new machine and 2.0 sec on the old one, because it only starts its clock after launching. You might need to take a video of the screen + input devices to really capture your experience.
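To make that concrete, here is a minimal sketch of measuring the part a benchmark's own clock never sees: wall time from spawn to first output versus total wall time. The ./some_benchmark binary is a hypothetical stand-in for whatever you want to time.

    import subprocess
    import time

    cmd = ["./some_benchmark"]  # hypothetical benchmark that prints its own result

    t_spawn = time.perf_counter()
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    first_line = proc.stdout.readline()   # blocks until the program prints anything
    t_first_output = time.perf_counter()
    proc.wait()
    t_exit = time.perf_counter()

    print(f"launch-to-first-output: {t_first_output - t_spawn:.2f}s")
    print(f"total wall time:        {t_exit - t_spawn:.2f}s")
    print(f"benchmark's own report: {first_line.strip()}")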
Any thoughts?
Computers were slow in the 80s/90s, so if you wanted your program to run fast, you had to know how cache and RAM worked, and utilise what little power was there to the max. As computers have gotten faster, that knowledge has gotten less relevant to the average developer.
There are some fields where that knowledge is useful, like embedded systems, HPC or games, but they are a minority.
I won't deny there are some benefits in not needing to care - you can write programs faster (even if they perform slower), and some classes of bugs can be reduced (e.g. memory bugs). These are good things, but fundamentally our programs still move memory around and interface with the real world.
As more people forget (or aren't taught) to keep this in mind, we'll lose the ability to make things fast when we need to, and our software will get slower and slower until it's unusable.
On a related note, we're also not allergic to complexity the way we should be. The average developer doesn't have a problem using hundreds of libraries in a single project if it makes their job easier. Unfortunately this also makes programs slower, harder to debug, and increases the attack surface for vulnerabilities.
Unlike high CPU usage or network problems, thrashing[1] can potentially affect any memory operation, so even carefully designed GUIs will freeze.
This effect will not happen in systems with sufficient RAM, or that don't swap memory to disk. In that case it'll either crash the program trying to allocate, or kill a random process[2].
[1] https://en.wikipedia.org/wiki/Thrashing_(computer_science)
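A rough way to confirm this on Linux is to watch the swap and major-fault counters while the machine feels frozen. A minimal sketch, assuming /proc/vmstat is available; a steadily climbing pgmajfault/pswpin rate during a freeze usually means the machine is swapping rather than computing.

    import time

    def vmstat_counters():
        # Parse /proc/vmstat into a dict of counter name -> value.
        counters = {}
        with open("/proc/vmstat") as f:
            for line in f:
                key, value = line.split()
                counters[key] = int(value)
        return counters

    before = vmstat_counters()
    time.sleep(5)
    after = vmstat_counters()

    for key in ("pgmajfault", "pswpin", "pswpout"):
        print(f"{key}: {after.get(key, 0) - before.get(key, 0)} in 5s")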
* regular software updates -- more features and larger codebases running on the same machine
* larger quantities of software running at the same time -- on older machines multitasking either wasn't as common or wasn't even possible, and as time went on more software became available as did the ability to run more of it at the same time.
* the internet -- pages on the internet get larger and more demanding all of the time, while the capability of any given piece of hardware stays constant.
* active cooling and thermal throttling -- hardware can actually run slower as dust builds up, cooling gets less effective, and the chip throttles itself to stay within its thermal limits (a quick check is sketched after this list)
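For the thermal throttling item, here is a minimal sketch of comparing the current CPU frequency against the hardware maximum on Linux, assuming the cpufreq sysfs interface is exposed; it's only meaningful while the machine is under sustained load.

    from pathlib import Path

    cpu0 = Path("/sys/devices/system/cpu/cpu0/cpufreq")
    cur = int((cpu0 / "scaling_cur_freq").read_text())   # current frequency, kHz
    max_ = int((cpu0 / "cpuinfo_max_freq").read_text())  # hardware maximum, kHz

    print(f"cpu0: {cur / 1000:.0f} MHz of {max_ / 1000:.0f} MHz max")
    if cur < 0.7 * max_:
        print("well below max under load -> possible thermal throttling")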
Silicon aging contributes some: https://en.wikipedia.org/wiki/Transistor_aging
This wouldn't have affected your Apple II and PDP-8s, due to the huge transistors in those machines (relative to modern integration scales), which also ran at nowhere near modern clock speeds.
2) local state grows and evolves in complexity. Configuration databases grow, new files are added to the filesystem, software upgrades leave unused state around (because getting software upgrades right is still a not-totally-solved problem), and indexes (or similar) of mutable state grow but sometimes don't get rebuilt, so locality suffers and cache performance with it (a sketch of rebuilding one common kind of local state follows after this list).
3) psychology.
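To illustrate the local-state point in item 2: many applications keep their mutable state in SQLite files, and years of inserts and deletes fragment them. A minimal sketch of rebuilding one such database; app_state.db is a hypothetical path, so point it at a copy of any application database rather than the live file.

    import sqlite3

    db_path = "app_state.db"  # hypothetical; use a copy, not the live file

    # isolation_level=None keeps the connection in autocommit mode,
    # which VACUUM requires (it can't run inside a transaction).
    con = sqlite3.connect(db_path, isolation_level=None)
    before = con.execute("PRAGMA freelist_count").fetchone()[0]
    con.execute("VACUUM")    # rewrite the file compactly, rebuilding its b-trees
    con.execute("ANALYZE")   # refresh the query planner's statistics
    after = con.execute("PRAGMA freelist_count").fetchone()[0]
    con.close()

    print(f"free pages before: {before}, after: {after}")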
As for Android, I have experienced this a million times also, but do not know the cause. I do usually see an accumulation of new and stupid bugs as I get software updates: incorrect app scaling and positioning, etc., which were not the case for the first six months I had the phone.
- Windows Update can take a lot of the blame for this. Basically, the longer the Windows release has been out, the more Windows updates there are. After every boot, one of the first things Windows does is check for updates. It then scans the local folder where the updates are saved and does some sort of check to make sure everything is good. Processing this folder can take hours on an older machine with a spinning disk.
- The Windows registry, where almost all Windows settings and quite often app settings are stored, has terrible performance and can become bloated over the years as more and more stuff is written to it.
- An SSD can suffer bad write performance if you are dealing with a somewhat full disk and it can’t trim space quick enough to deal with new writes coming in.
- In Linux you’re not supposed to mount with discard any more but instead rely on a cron job that runs in the background once a day. I think various manufacturers are supposed to handle the trim in firmware too but then it’s a black box.
- Filesystems generally perform worse as they fill up. In my experience NTFS seems to have a lower threshold than ext4 or XFS.
- Your disk might be failing. Look at the SMART stats to see if you are hitting unacceptable error thresholds (a quick check covering this and the fill-level and trim points above is sketched after this list).
- In Linux anything that uses snaps will take forever to start the first time.
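Here is a quick way to look at the fill-level, SMART, and trim points above from one place. It's a sketch with assumptions: Linux, smartmontools installed, a systemd distro where periodic TRIM ships as fstrim.timer, and /dev/sda as the device name; run as root so smartctl can talk to the drive.

    import shutil
    import subprocess

    # Filesystem fill level: most filesystems degrade noticeably as they approach full.
    usage = shutil.disk_usage("/")
    print(f"root filesystem {usage.used / usage.total * 100:.0f}% full")

    # SMART health summary (device name is an assumption; needs root).
    smart = subprocess.run(["smartctl", "-H", "/dev/sda"],
                           capture_output=True, text=True)
    print(smart.stdout.strip() or smart.stderr.strip())

    # Is periodic TRIM enabled?
    trim = subprocess.run(["systemctl", "is-enabled", "fstrim.timer"],
                          capture_output=True, text=True)
    print("fstrim.timer:", trim.stdout.strip() or trim.stderr.strip())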
Software can decay, hardware can decay, both can exacerbate each other. The registry gets filled up, the filesystem gets fragmented, the software accumulates in memory, upgrades of the same software accumulate hacks and poor design leads to worse fixes to problems, and slowly buggier versions are shipped. The hard drive slowly fails, the SSD loses viable blocks, the fan and heatsink gets clogged with dust and less efficiently cools the system, the battery loses capacity which causes the system to throttle itself, the power supply components slowly degrade causing voltage inconsistency, tin whiskers develop.
The simplest explanation I can give is entropy. Everything in the universe is in a state of decay. Eventually the thing decays so much it breaks down.
The only problem I have is that I can neither update Chrome to a modern version (understandable) nor downgrade it to the version with Adobe Flash support (not understandable). Not supporting Flash is kind of moronic, despite Flash's notorious bugginess, because if you (Google) don't support the browser for me anyway and my browser is buggy a priori, why bother me by disallowing Adobe Flash?
Another vector is uncontrolled synchronicity. E.g. I knew programs that installed an Explorer context menu item that could delay the menu by two seconds when you right-clicked. I don't remember the names, but I remember hunting them down and removing them from my context menu.
So I think that mostly all of this comes down to (1) problematic implementations of caches and (2) plugins that break assumptions about how long something should take.
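If you want to hunt down handlers like that today, here is a minimal sketch; it assumes Windows and that the slow extensions registered under the common HKCR\*\shellex\ContextMenuHandlers key (shell extensions can also register under per-file-type keys).

    import winreg  # Windows only

    # Enumerate context-menu handlers registered for all file types.
    path = r"*\shellex\ContextMenuHandlers"
    with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, path) as key:
        n_subkeys, _, _ = winreg.QueryInfoKey(key)
        for i in range(n_subkeys):
            name = winreg.EnumKey(key, i)
            try:
                with winreg.OpenKey(key, name) as sub:
                    clsid, _ = winreg.QueryValueEx(sub, "")  # default value holds the CLSID
            except OSError:
                clsid = "<no default value>"
            print(f"{name}: {clsid}")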
Your software most likely did not run off a hard disk, so it was slow to load anyway, and after that it ran within the memory it had. One program at a time, no memory swapping.
> I don't think it affected PC compatible computers running DOS.
Umm.. I remember being mesmerized by disk defrag programs. There were also TSR programs and other ways to run more than one program at a time.
Not sure why it happens with phones -- I assumed the OSes shipped more expensive animations & apps to encourage people to upgrade.
I've been able to observe, most obviously with my first gen Nexus 7 but also with other devices, that pure CPU benchmarks are still fine, but "disk" benchmarks show terribly low (~100kbps) throughput. Lots of modern software assumes flash/SSD is fast, but inexpensive flash wears out to the point that this assumption fails.
At least once I've seen a thorough wipe very temporarily restore some performance in such a case.
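A crude way to see this on a suspect device is to time a sequential write and read of a scratch file. A minimal sketch; real storage benchmarks control for caching far more carefully, and the read here may be served from the page cache.

    import os
    import time

    path = "throughput_test.bin"      # scratch file on the filesystem under test
    size = 64 * 1024 * 1024           # 64 MiB
    block = b"\0" * (1024 * 1024)

    t0 = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size // len(block)):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())          # force the data out to the device
    t1 = time.perf_counter()
    print(f"write: {size / (t1 - t0) / 1e6:.1f} MB/s")

    t2 = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(len(block)):
            pass
    t3 = time.perf_counter()
    print(f"read:  {size / (t3 - t2) / 1e6:.1f} MB/s")

    os.remove(path)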
Games can be parallelized, but in a relatively tight way: the game just has to coordinate the physics/AI stuff with the rendering stuff. The rendering is defined in a way that lets it consume most of the resources. Games on old CPUs do hit thresholds where they stop being playable, but they are fairly concrete boundaries where that number of cores and clocks just won't hit the pacing needed for that level of detail.
But the general operating system layer has made a point of pushing both power and responsibility onto userspace applications, and allows them to be helpful and do things in the background. The applications in turn are doing a lot of redundant, overlapping work - no unified methods of serializing and communicating, polling for events, contra-user telemetry and the like - so the end result is a pileup of background load that can kill the experience.
The switch to the SSD era cured one aspect of this - background storage accesses - but just brought on different bottlenecks with memory, core, and network utilization.
The way it might be addressed is to give the OS relatively greater, higher-level responsibility and supersede more of what the applications are doing. The open-source/free-software space has a better chance of accomplishing this because, while it usually develops at a slower pace than a commercial company, it's more "sticky" in terms of accumulation at the base layer and fixing root causes instead of hacking around them. And people generally do report that installing Linux is a speed-up for their old device; when you run a contemporary Linux distro you get a lot of software that has been properly domesticated to the environment, and therefore doesn't turn into a burden.
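To make the "pileup of background load" point above concrete, here is a rough sketch of sampling which processes burn CPU while you sit idle; it assumes the third-party psutil package is installed.

    import time
    import psutil

    # Prime the per-process CPU counters, sit idle, then read the deltas.
    procs = list(psutil.process_iter(["name"]))
    for p in procs:
        try:
            p.cpu_percent(None)
        except psutil.Error:
            pass

    time.sleep(10)  # do nothing during this window

    samples = []
    for p in procs:
        try:
            samples.append((p.cpu_percent(None), p.info["name"] or "?"))
        except psutil.Error:
            pass

    # The top offenders are your background load.
    for cpu, name in sorted(samples, reverse=True)[:10]:
        print(f"{cpu:5.1f}%  {name}")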
My Mac basically never has it, unless IntelliJ is running in the background and decides to re-index.
Most software that phones home is designed to work regardless. But the functionality can degrade and performance takes a hit.
Obviously, updates become impossible.
It's hardest to identify exactly which one blew unless it is obvious.
Toss the motherboard and upgrade it.
That sounds like a faulty hardware issue, or, more likely, a configuration issue.
Once upon a time, the saying was "If you want to speed up a slow machine, throw more RAM at it", as more RAM meant less swapping/caching to slow hard drives.
On a desktop, I think it's more about hygiene and expertise with the OS.
It comes from various electrical components; for example, electrolytic capacitors dry out.
The reason consoles don't use them is that they never want the bad reputation of dying consoles.
Why do 'computers' use them? Planned obsolescence. You will go buy that new computer within a certain number of years.
Defragging freed up contiguous space for the virtual memory swap file
Because HW gets older. Your access time for RAM will not be the same as when it was new; resistors, capacitors, etc. will change value due to aging. Your semiconductors will experience migration of ions.
The SW is not designed with this in mind, and if some operation does not work or does not finish within the expected period of time, it will mostly crash.
FreeBSD+XFCE for the win.
I thought this was a bug in Firefox, but now it's in the Chrome-based browser too.
I still use an iPhone 6A and it works fine.
Stuff like this is one of the key reasons I dumped Windows.