HACKER Q&A
📣 PaulHoule

Why do computers/tablets/etc "freeze up" as they get older?


By "computer" I mean: desktop computer, phone, tablet, VR headset. I'm not so sure that the problem happens with game consoles -- but it also doesn't seem to affect PC games. (The machine annoying me the most right now is an 8th generation iPad.)

So far as I can tell the problem started around the time Win95 came out, and it affected MacOS Classic at around the same time. It did not seem to affect minicomputers like the PDP-8, PDP-11, and VAX. It did not seem to affect home computers like the Apple ][, Commodore 64, and such. I don't think it affected PC-compatible computers running DOS. I don't think it affected Sun or AIX workstations, or Linux machines running X Windows, until KDE and Gnome came along; now it does.

(I think it happened around the time that most GUI systems started to incorporate distributed object models and RPC mechanisms like COM, XPC services, DBus, Binder, etc. Certainly all the systems that annoy me like this have something like that.)

Reinstalling the OS seems to help for a short time, but the honeymoon gets shorter each time: the machine becomes slow again faster than it did before.

Back in the day it helped to defragment hard drives but we're told we don't need to do this in the SSD era.

If I had to describe it, it is not that the machine gets "slower", but instead it has periods of being unresponsive that become longer and more frequent. If it was a person or an animal I'd imagine that it was inattentive, distracted, or paying attention to something else.

If you benchmarked in the ordinary way you might not see the problem, because it's a disease of interactive performance: a benchmark might report 10 seconds on either an old or a new machine, but it might take 0.2 sec for the benchmark to launch on the new machine and 2.0 sec on the old one, and the benchmark starts the clock after that. You might need to take a video of the screen + input devices to really capture the experience.
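One way to capture that launch latency, which conventional benchmarks skip, is to time a child process from spawn to its first output separately from its subsequent run. A minimal sketch (the child here is a stand-in, not a real benchmark):

```python
import subprocess
import sys
import time

def time_launch_and_work(cmd):
    """Time a child from spawn to first output (launch latency) and
    from first output to exit (the part a benchmark would report)."""
    start = time.perf_counter()
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    proc.stdout.readline()          # child prints a line once it is "running"
    launched = time.perf_counter()
    proc.wait()
    finished = time.perf_counter()
    return launched - start, finished - launched

# Stand-in child: announces it has started, then "works" for ~0.1 s.
child = [sys.executable, "-c",
         "import time; print('ready', flush=True); time.sleep(0.1)"]
launch, work = time_launch_and_work(child)
print(f"launch latency: {launch:.3f}s, benchmarked work: {work:.3f}s")
```

On the aging machines described above, the first number would grow while the second stayed roughly constant.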

Any thoughts?


  👤 Pingk Accepted Answer ✓
The average developer doesn't put the same care into what they're developing as developers had to in the past.

Computers were slow in the 80s/90s, so if you wanted your program to run fast, you had to know how cache and RAM worked, and utilise what little power was there to the max. As computers have gotten faster, that knowledge has gotten less relevant to the average developer.

There are some fields where that knowledge is useful, like embedded systems, HPC or games, but they are a minority.

I won't deny there are some benefits in not needing to care - you can write programs faster (even if they perform slower), and some classes of bugs can be reduced (e.g. memory bugs). These are good things, but fundamentally our programs still move memory around and interface with the real world.

As more people forget (or aren't taught) to keep this in mind, we'll lose the ability to make things fast when we need to, and our software will get slower and slower until it's unusable.

On a related note, we're also not allergic to complexity the way we should be. The average developer doesn't have a problem using hundreds of libraries in a single project if it makes their job easier. Unfortunately this also makes programs slower, harder to debug, and increases the surface area for a vulnerability.


👤 BoppreH
Sounds like excessive paging, aka thrashing[1]. Newer software uses more memory, and when the OS has no more physical memory to allocate, it moves data back and forth between RAM and disk until the process frees those pages. This destroys performance, and the more memory is overallocated, the longer it takes to finish the operation and return to normal.

Unlike high CPU usage or network problems, this can potentially affect any memory operation, so even carefully designed GUIs will freeze.

This effect won't happen on systems with sufficient RAM, or on systems that don't swap memory to disk. In the latter case, running out of memory instead crashes either the program trying to allocate or a random process[2].

[1] https://en.wikipedia.org/wiki/Thrashing_(computer_science)

[2] https://en.wikipedia.org/wiki/Out_of_memory
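The cliff-edge nature of thrashing can be illustrated with a toy LRU page-replacement simulation (all names here are illustrative): once the working set exceeds the available frames by even one page, a cyclic access pattern goes from a handful of cold faults to faulting on every single access.

```python
from collections import OrderedDict

def lru_faults(accesses, frames):
    """Count page faults for an access sequence under LRU replacement
    with a fixed number of physical frames."""
    cache = OrderedDict()
    faults = 0
    for page in accesses:
        if page in cache:
            cache.move_to_end(page)         # mark as most recently used
        else:
            faults += 1
            if len(cache) >= frames:
                cache.popitem(last=False)   # evict least recently used
            cache[page] = None
    return faults

# A process cycling through a working set of 8 pages, 100 times over.
workload = list(range(8)) * 100

fits = lru_faults(workload, frames=8)    # working set fits: 8 cold faults
thrash = lru_faults(workload, frames=7)  # one frame short: every access faults
print(fits, thrash)
```

With 8 frames the 800 accesses cost 8 faults; with 7 frames, LRU always evicts exactly the page needed next, so all 800 accesses fault. Real paging behaves the same way, except each fault costs a disk round-trip.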


👤 kube-system
There are a few things we have on newer computers that we didn't have on older ones:

* regular software updates -- more features and larger codebases running on the same machine

* larger quantities of software running at the same time -- on older machines multitasking either wasn't as common or wasn't even possible, and as time went on more software became available as did the ability to run more of it at the same time.

* the internet -- pages on the internet get larger and more demanding all of the time, while the capability of any given piece of hardware stays constant.

* active cooling and thermal throttling -- hardware can actually run slower if it gets dirty


👤 teeray
I've always attributed it to the fact that developers typically have really good hardware all the time. When you develop on a great machine, you don't give the performance of older machines as much consideration. Software now typically has a mandatory upgrade path, so even if there is a version that works well with your hardware, you may not be allowed to use it.

👤 kazinator
Mostly, it's the software upgrades. Programs almost always get more bloated with each update: bigger executable sizes, more features that need more cruft initialized on startup, and so on.

Silicon aging contributes some: https://en.wikipedia.org/wiki/Transistor_aging

This wouldn't have affected your Apple II and PDP-8s, due to the huge transistors in those things (relative to modern integration scales), and the fact that they didn't run at anywhere near modern clock speeds.


👤 a-dub
1) software grows and evolves in complexity over time while hardware remains constant until it is upgraded, this explains the macro or low frequency component of the slowdowns (ie, things that reinstalling the o/s won't totally solve)

2) local state grows and evolves in complexity. configuration databases grow, new files are added to the filesystem, software upgrades leave unused state around (because getting software upgrades right is still a not-totally-solved problem), indexes (or similar) of mutable state grow but sometimes don't get rebuilt causing locality to suffer affecting cache performance.

3) psychology.


👤 wegfawefgawefg
Apple has admitted to reducing clock speeds of devices during updates to increase battery life.

As for android, I have experienced this a million times also, but do not know the cause. But I do usually see an accumulation of new and stupid bugs as I get software updates. Incorrect app scaling and positioning, etc which were not the case for the first six months I had the phone.


👤 RyanShook
One aspect sometimes overlooked is that the physical device degrades over time. Blocks of memory corrupt, connections erode, humidity or dust slows down cooling mechanisms. We tend to think of computers as either working or not but they can and do break down over time.

👤 montjoy
There are a few things going on that I’m aware of. It’s usually related to disk.

- Windows Update can take a lot of blame for this. Basically, the longer the Windows release has been out, the more updates there are. After every boot, one of the first things Windows does is check for updates. It then scans the local folder where the updates are saved and does some sort of check to make sure everything is good. Processing this folder can take hours on an older machine with a spinning disk.

- The Windows registry, where almost all Windows settings and quite often app settings are stored, has terrible performance and can become bloated over the years as more and more stuff is written to it.

- An SSD can suffer bad write performance if you are dealing with a somewhat full disk and it can’t trim space quick enough to deal with new writes coming in.

- In Linux you’re not supposed to mount with discard any more but instead rely on a cron job that runs in the background once a day. I think various manufacturers are supposed to handle the trim in firmware too but then it’s a black box.

- Filesystems generally perform worse as they fill up. In my experience NTFS seems to have a lower threshold than ext4 or XFS.

- Your disk might be failing. Look at the SMART stats to see if you are hitting unacceptable error thresholds.

- In Linux anything that uses snaps will take forever to start the first time.
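As a quick first diagnostic for several of the points above (SSD write degradation and filesystem slowdown on nearly full disks), a small disk-fullness check. The ~90% threshold is a rough rule of thumb, not a spec value:

```python
import shutil

def disk_pressure(path="/"):
    """Report free space and percent used for the filesystem at `path`.
    Near-full disks hurt both SSD write performance (less room for
    trim/wear-leveling) and filesystem allocation."""
    usage = shutil.disk_usage(path)
    pct_used = usage.used / usage.total * 100
    return usage.free, pct_used

free_bytes, pct_used = disk_pressure("/")
print(f"{pct_used:.1f}% used, {free_bytes / 1e9:.1f} GB free")
if pct_used > 90:
    print("warning: filesystems and SSDs tend to degrade past ~90% full")
```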


👤 throwawaaarrgh
There are a lot of different factors at play. What you're seeing looks similar but has many different causes in different cases, like having a runny nose. Lots of different things cause it, same symptom.

Software can decay, hardware can decay, both can exacerbate each other. The registry gets filled up, the filesystem gets fragmented, the software accumulates in memory, upgrades of the same software accumulate hacks and poor design leads to worse fixes to problems, and slowly buggier versions are shipped. The hard drive slowly fails, the SSD loses viable blocks, the fan and heatsink gets clogged with dust and less efficiently cools the system, the battery loses capacity which causes the system to throttle itself, the power supply components slowly degrade causing voltage inconsistency, tin whiskers develop.

The simplest explanation I can give is entropy. Everything in the universe is in a state of decay. Eventually the thing decays so much it breaks down.


👤 AngryData
I haven't experienced this for nearly 15 years now. If something starts acting up I always find there is either hardware going bad that will completely fail before too long, a bunch of bloatware running in the background, a drive is completely full and there isn't enough room for page file, or it has uptime measuring in weeks or months.

👤 eimrine
I use a lot of 10-20 year old computers and have never experienced this. Right now I am writing from a single-core P4 with 2 GB of RAM; of course it is 32-bit. I can still open any website from HN except those requiring a top-notch web browser.

The only problem I have is that I can neither update Chrome to a modern version (understandable) nor downgrade it to the version with Adobe Flash support (not understandable). Not supporting Flash is kind of moronic, despite Flash's notorious bugginess: if you (Google) don't support a browser for me anyway, and my browser is buggy a priori, why bother me by disallowing Adobe Flash?


👤 wruza
For a program to freeze, it must do i/o, usually. E.g. my image viewer has a convenient thumbnail cache which speeds up things. But when I start it in a folder of many folders of many images, it freezes for several seconds in i/o. Sadly it has features that stop me from migrating away.

Another vector is uncontrolled synchronicity. E.g. I knew programs that could install an Explorer context menu item that could delay presenting it for two seconds when you right-click. I don’t remember the names, but remember hunting them down and removing from my context menu.

So I think that mostly all of this is because of (1) problematic implementations of caches and (2) plugins that break assumptions on how long something should take.
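The usual fix for category (1) is to keep the slow I/O off the UI thread entirely. A minimal sketch, with a hypothetical "thumbnail scan" standing in for the slow cache build:

```python
import concurrent.futures
import time

def build_thumbnail_cache():
    """Stand-in for a slow I/O pass (e.g. scanning folders of images)."""
    time.sleep(0.2)
    return {"thumbs": 128}

# Run the slow scan on a worker thread instead of blocking the UI thread.
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(build_thumbnail_cache)
    ticks = 0
    while not future.done():
        ticks += 1                # stands in for servicing input events
        time.sleep(0.01)
    cache = future.result()

print(ticks, cache)
```

The point of the sketch: the "UI loop" keeps ticking (handling events) the whole time the scan runs, instead of freezing for the duration of the I/O.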


👤 wila
> It did not seem to affect home computers like the Apple ][, Commodore 64, and such.

Your software most likely did not run off a hard disk. So it was slow to load anyway.. and after that it ran within the memory it had. One program at a time, no memory swapping.

> I don't think it affected PC compatible computers running DOS.

Umm.. I remember being mesmerized by disk defrag programs. There were also TSR programs and other ways to run more than one program at a time.


👤 fomine3
HDDs wear out. Cheap eMMCs wear out too. I believe these were a big reason for performance degradation on computers. With decent SSDs it shouldn't happen (within the computer's lifetime), but some computers' performance still degrades a bit, so there must be other reasons too, like bloating Chromium.

👤 qiqitori
This is obviously not the case in general. There are a few things that may be correlated with the age of a computer though -- full SSDs are often slower; newer programs/sites require more memory and CPU; thermal throttling due to dried up thermal paste or dust.

👤 verteu
I've had laptops slow down like this due to dust on the fans. They weren't able to cool effectively, and presumably the CPU got throttled.

Not sure why it happens with phones -- I assumed the OSes shipped more expensive animations & apps to encourage people to upgrade.


👤 arantius
In at least one case (I've most clearly observed this in cheapish Android tablets): inexpensive flash storage.

I've been able to observe, most obviously with my first gen Nexus 7 but also with other devices, that pure CPU benchmarks are still fine, but "disk" benchmarks show terribly low (~100kbps) throughput. Lots of modern software assumes flash/SSD is fast, but inexpensive flash wears out to the point that this assumption fails.

At least once I've seen a thorough wipe very temporarily restore some performance in such a case.
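A crude way to check for this kind of flash wear is a sequential-write timing like the sketch below. The sizes are illustrative and a real benchmark would control many more variables, but worn flash often shows up dramatically even in a rough test like this:

```python
import os
import tempfile
import time

def write_throughput_mb_s(size_mb=16, block_kb=256):
    """Crude sequential-write check: time fsynced writes to a temp file.
    Healthy flash should manage tens of MB/s or more; badly worn
    flash can drop orders of magnitude lower."""
    block = os.urandom(block_kb * 1024)
    blocks = size_mb * 1024 // block_kb
    with tempfile.NamedTemporaryFile() as f:
        start = time.perf_counter()
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())        # force the data out to the device
        elapsed = time.perf_counter() - start
    return size_mb / elapsed

speed = write_throughput_mb_s()
print(f"{speed:.1f} MB/s sequential write")
```

Note this measures the filesystem plus the device together; on a device like the old Nexus 7 described above, the result would collapse even though CPU benchmarks stayed normal.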


👤 atomicnumber3
FWIW, I don't personally think this affects most Linux distros. My Arch Linux laptop from 2014 still runs just as well as it did on day 1. The only reason I don't use it is that I wanted to use a gen4 SSD (at gen4 speeds). I verified this when I revived it recently to run a Win95 emulator so my son could play an old PC game.

👤 crq-yml
I believe it's attributable to "wider" vs "taller" software development. As in, software using more cores, running more background processes, intermittently accessing I/O more frequently, vs software that does a single task more intensively like a game.

Games can be parallelized, but in a relatively tight way: the game just has to coordinate the physics/AI stuff with the rendering stuff. The rendering is defined in a way that lets it consume most of the resources. Games on old CPUs do hit thresholds where they stop being playable, but they are fairly concrete boundaries where that number of cores and clocks just won't hit the pacing needed for that level of detail.

But the general operating system layer has made a point of pushing both power and responsibility onto userspace applications, and allows them to be helpful and do things in the background. The applications in turn are doing a lot of redundant, overlapping work - no unified methods of serializing and communicating, polling for events, contra-user telemetry and the like - so the end result is a pileup of background load that can kill the experience.

The switch to the SSD era cured one aspect of this - background storage accesses - but just brought on different bottlenecks with memory, core, and network utilization.

The way it might be addressed is to give the OS relatively greater, higher-level responsibility and supersede more of what the applications are doing. The open-source/free-software space has a better chance of accomplishing this because, while it usually develops at a slower pace than a commercial company, it's more "sticky" in terms of accumulation at the base layer and fixing root causes instead of hacking around them. And people generally do report that installing Linux is a speed-up for their old device; when you run a contemporary Linux distro you get a lot of software that has been properly domesticated to the environment, and therefore doesn't turn into a burden.


👤 yawnxyz
my old Compaq w/ Windows 95 boots up like a charm, and I can still play games like X-Com w/o a problem. My early generation iPad can barely run the home screen.

👤 Aeolun
My iPhone does this, and I have no idea why. It’ll just randomly have huge issues switching from one program to another. It doesn’t have an activity indicator on individual programs, so I never know what the issue is.

My mac basically never has it, unless Intellij is running in the background and decides to re-index.


👤 austinjp
Has anyone had any luck breathing life into old tablets with PostmarketOS or anything similar? I'm talking about Android tablets that are now so slow as to be literally unusable. Some comments here mention hardware degradation, so can a lightweight alternative OS/distro make a difference?

👤 pyuser583
A reason not mentioned so far: lots of software "phones home." If home doesn't exist, or network changes take place, the software can get confused.

Most software that phones home is designed to work regardless. But the functionality can degrade and performance takes a hit.

Obviously, updates become impossible.


👤 egberts1
Capacitors wear out.

It's hard to identify exactly which one blew unless it's obvious.

Toss the motherboard and upgrade it.


👤 jimjimjim
Those operating systems have non-kernel long running software and more and more long running UI software. Heap fragmentation and memory garbage collection seem like probable causes.

👤 simonblack
They don't. Yes they might get slower as newer software imposes higher demands on less-capable hardware. But they shouldn't freeze.

That sounds like a faulty hardware issue, or, more likely, a configuration issue.

Once upon a time, the saying was "If you want to speed up a slow machine, throw more RAM at it" as more RAM meant less slow swapping/caching to slow hard-drives.


👤 gdulli
I had an iPhone and Google tablet that both had this problem. I stopped buying hardware from either. I switched to Galaxy phones and haven't had a problem since. My current S20 is 3 years old and runs like the day I got it. My previous galaxy phone still ran perfectly well after 4 years. I have no idea what accounts for it.

On a desktop, I think it's more about hygiene and expertise with the OS.


👤 incomingpain
It does happen with consoles, but much later. That helps explain why.

It comes from various electrical components; for example, electrolytic capacitors dry out.

The reason console makers avoid them is that they never want the bad reputation of dying consoles.

Why do 'computers' use them? Planned obsolescence. You will go buy that new computer within a certain number of years.


👤 gumballindie
Some manufacturers, such as Apple, have been intentionally making devices slower to drive sales of new products. Potentially Microsoft does the same with Windows. For instance, they recently announced that computers lacking a certain hardware component will not be able to run new versions of the OS, making hundreds of millions of perfectly running computers obsolete.

👤 userbinator
I personally haven't seen this effect on my machines, but then again, I don't change my configuration much and keep a constant watch over resource consumption. I suspect it occurs to those who gradually install and run lots of applications over time, and leave them open, without realising the resources they're using.

👤 perryizgr8
I had a Nexus 7 tablet that ran like a dream when it was new. Within a year it was unusably slow. That's because Google had shipped it with a very low quality SSD. It degraded quickly, resulting in slower and slower speeds. So at least that's one way computers can get slower or keep locking up after some time.

👤 kraig911
I feel so much of speed is perception. When I first got the internet I had no problem waiting 3-5 minutes for webpages to load. Now I expect 3-5 ms. The current context of your experience of speed might influence your expectations when using older equipment.

👤 system2
Batteryless devices are superior to whatever we had in the 90s. Blue screen of death was a daily occurrence in the past. I don't remember having any device issues in the last 10 years. Devices with batteries are having issues with voltage drops.

👤 Grimblewald
If the system is otherwise stable, then the first thing that springs to mind for me is storage going bad. Bad sector reads/writes will cause wait times while errors are corrected or mitigated etc but otherwise cause no real issues.

👤 mortallywounded
I only experience "slow down/freeze up" with Windows and to a lesser extent, Mac based operating systems. I have never felt that way with a Linux install.

👤 slantaclaus
Sounds like a Windows thing in my experience

👤 hammock
Memory overflows.

Defragging freed up space for virtual memory


👤 hulitu
> Ask HN: Why do computers/tablets/etc "freeze up" as they get older?

Because the HW gets older. Your access time for RAM will not be the same as when it was new; resistors, capacitors, etc. will change value due to aging. Your semiconductors will experience migration of ions.

The SW is not designed with this in mind, and if some operation does not work or does not finish within a period of time, it will mostly crash.


👤 Gud
Doesn't happen to my computer, which runs FreeBSD and is fairly old by now. X1 Carbon Gen 6.

FreeBSD+XFCE for the win.


👤 stuaxo
My Android phone is doing this now.

I thought this was a bug in firefox, but now it's in the chrome based browser too.


👤 ralph84
Your iPad has flash memory and a battery. Both are consumable items that wear out over time.

👤 0172
"Wirth's law is an adage on computer performance which states that software is getting slower more rapidly than hardware is becoming faster."

https://en.wikipedia.org/wiki/Wirth%27s_law



👤 andrewstuart
Doesn't happen with my Mac. Just make sure you have minimum 16GB.

I still use an iPhone 6A and it works fine.

Stuff like this is one of the key reasons I dumped Windows.


👤 ijhuygft776
broken fan -> overheating, sometimes, maybe

👤 grigio
check the ram usage