HACKER Q&A
📣 Justin_K

Why is desktop computing still so slow?


I don't understand why using a desktop is still so painfully slow. On moderate-to-better hardware 10 years ago, it took ~5-10 seconds for me to open Photoshop... today it's still the same. Windows took 10-30 seconds to load then, and it's about the same now (from pressing the power button to actually being in Windows with the desktop loaded). With the hardware being multiple times faster, why doesn't software follow suit? If you had asked me 10 years ago, I would have been most looking forward to things loading in an instant, but we appear to be no better off today.


  👤 juststeve Accepted Answer ✓
>With the hardware being multiple times faster, why doesn't software follow suit?

Because software has to be written in a way that leverages the new hardware; the optimization is not automatic. If you haven't read this[1], it may provide some insight.

[1] https://lwn.net/Articles/250967/ (see parts 5 and 6)
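
To give a flavor of what that article covers, here is the classic row-order vs column-order traversal (a minimal sketch; the matrix size is arbitrary). Both functions do the same arithmetic, but the first walks memory sequentially and stays in cache, while the second strides across it and misses constantly:

    /* Same work, different memory access order. */
    #include <stddef.h>

    #define N 1024
    static double m[N][N];

    double sum_row_major(void) {      /* fast: sequential, cache-friendly */
        double s = 0;
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                s += m[i][j];
        return s;
    }

    double sum_col_major(void) {      /* slow: large stride, cache-hostile */
        double s = 0;
        for (size_t j = 0; j < N; j++)
            for (size_t i = 0; i < N; i++)
                s += m[i][j];
        return s;
    }

Compilers can sometimes interchange loops like these, but in general someone has to notice and restructure the code.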


👤 throwaheyy
Talking specifically about boot? What's left once all the other bottlenecks (such as mechanical HDDs or insufficient RAM) are eliminated?

Plug and play: enumerating buses for hardware and loading drivers for it all. Starting off in backward-compatible hardware modes (e.g. a BIOS using PIO in the case of storage devices) and progressively enabling faster and more complicated modes (e.g. DMA or AHCI) as drivers get loaded.

The same mechanisms that allow you to plug hardware into a desktop and have it “just work” make your computer slower to start up than it could theoretically be. There’s a trade-off between that configurability and startup performance. Embedded systems can start up faster because the addresses of their peripherals, resources, accessories, etc. are computed ahead of time and hardcoded. They don’t have to dynamically rediscover them every time they start.
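
To make that concrete, here is a minimal sketch of the embedded approach (the address and register layout are made up for illustration). The peripheral's location is baked in at compile time, so there is nothing to discover at startup:

    #include <stdint.h>

    /* Hypothetical memory-mapped UART at a fixed, known address. */
    #define UART0_BASE 0x40001000u
    #define UART0_DATA (*(volatile uint32_t *)(UART0_BASE + 0x00))

    void uart_putc(char c) {
        UART0_DATA = (uint32_t)c;   /* write straight to the hardware register */
    }

A desktop OS, by contrast, has to walk the PCI bus, read device IDs, and match drivers before it can do the equivalent.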

Extend the “plug and play” concept to software — modular software packages, frameworks, reusable components, dynamic plugins. That’s why, once the OS is loaded, software is slow to start up.

The configurability/performance trade-off is true at every level of computing, even down to the function call - https://stackoverflow.com/questions/4667882/is-a-statically-...
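
The function-call version of the trade-off, as a rough sketch: a direct call is fixed at compile time and can be inlined, while a call through a function pointer (the "pluggable" version) generally cannot.

    #include <stdio.h>

    static int add(int a, int b) { return a + b; }

    int main(void) {
        int (*op)(int, int) = add;    /* configurable: swap in any operation */
        printf("%d\n", add(2, 3));    /* direct call: inlinable */
        printf("%d\n", op(2, 3));     /* indirect call: flexible, but opaque
                                         to the optimizer */
        return 0;
    }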


👤 lucas_membrane
Page faults are the really slow thing. When you boot, your caches start out empty; the machine may well run faster after it has been up a while. If you have a pretty stable Linux machine, you might only reboot once or twice in a pandemic.
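
You can watch this happen on Linux with getrusage(), which reports how many page faults had to go to disk (a rough sketch; the file path is just an example, and you may need to drop the page cache first to see any major faults):

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/resource.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void) {
        struct rusage before, after;
        struct stat st;
        int fd = open("/usr/bin/ls", O_RDONLY);   /* any biggish file */
        if (fd < 0 || fstat(fd, &st) < 0) return 1;

        unsigned char *p = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (p == MAP_FAILED) return 1;

        getrusage(RUSAGE_SELF, &before);
        volatile unsigned long sum = 0;
        for (off_t i = 0; i < st.st_size; i += 4096)
            sum += p[i];                  /* first touch of each page faults */
        getrusage(RUSAGE_SELF, &after);

        printf("major faults (disk): %ld, minor faults (RAM): %ld\n",
               after.ru_majflt - before.ru_majflt,
               after.ru_minflt - before.ru_minflt);
        munmap(p, st.st_size);
        close(fd);
        return 0;
    }

Run it twice: the second run finds the file already in the page cache and the major-fault count drops. That difference is cold boot vs warm machine in miniature.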

This situation may eventually be improved by some more secure runtime environments that somehow eliminate the need for the hardware to mother-hen so many memory accesses, but we might not be so happy working under such a regime either.


👤 peruvian
Because for every inch of computing power devs get, we take a mile in unoptimized code, frameworks, etc., in the name of 'developer experience'.

👤 the__alchemist
See Wirth's law: https://en.wikipedia.org/wiki/Wirth%27s_law : "Wirth's law is an adage on computer performance which states that software is getting slower more rapidly than hardware is becoming faster."

In work environments, it can be even worse. It takes me (at work) 5 minutes to log in; 2 minutes to open an MS Office app or email; 5-10s to save a document; 20s to insert a row in Excel. Please send help.

The answer to your question depends. For office settings like the one I described, it may be due to forced network requests and surveillance/compliance software. More broadly, it's unoptimized software, and various pieces of software competing for resources and scheduling. A parallel that may be easier to grok is the web: web apps can be small and responsive by using targeted DOM manipulation, but most use layers of frameworks, cookies, trackers, etc. that make them take up much more space and make them less responsive.

In order to fix this, we may need a new generation of operating systems and software, built from scratch, with a focus on performance (i.e. responsiveness).

This is one of the reasons I've gotten so excited about embedded and low-level programming: You can ditch the OS, and make programs that respond in a way that feels instantaneous.


👤 kazinator
Software can easily grow to eat available RAM, and to swamp your disk I/O.

SSDs are helping with this in one way, but also just making it worse by fueling it.

64-bit has also opened a gate toward bloatware: no more barrier in the 3 GB - 4 GB range. Plus, 64-bit directly contributes to bloat: every pointer now takes up twice as much space. Any data structure that contains nothing but pointers has doubled in size. If you previously had 100 MB worth of that structure in RAM, now it is 200 MB without any change in the program's source code.
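
Concretely (a sketch; the struct is made up, and exact sizes depend on the ABI):

    #include <stdio.h>

    /* A node made of nothing but pointers: 12 bytes on a typical
       32-bit target, 24 bytes on a typical 64-bit one. */
    struct node {
        struct node *next;
        struct node *prev;
        void        *payload;
    };

    int main(void) {
        printf("sizeof(struct node) = %zu\n", sizeof(struct node));
        return 0;
    }

Same source, double the RAM, just from recompiling for the wider pointers.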


👤 uberman
I have to wonder: how long would it take your desktop of today to open the Photoshop of 10 years ago?

Perhaps software bloat keeps pace with hardware capabilities.


👤 ianai
Stuff loads a lot faster and seems dramatically more responsive on my M1 Mac. It feels like the most significant advancement in a long time. They're built with on-die ARC memory management, and it's supported all through the OS. It seems to have paid off greatly.

Otherwise, it's not like software is written to be performant and always use 100% of all cores. That would require you, the user, to be doing a lot, and probably some dramatically re-engineered software up and down the stack.


👤 aakkaarr
Have you bought a new laptop recently? I just switched from a decent 7-year-old laptop to a decent new one in December, and I have completely different feelings than you: I didn't know my software stack could be even faster than it was, but it is!

System boot was 10 seconds and is 10 seconds again (Linux here), but starting PhpStorm, for example, is like 2-3 times faster.

I had a project a few GB in size that took 30 minutes to scan when first loaded in PhpStorm. Now it takes 10 minutes at most.


👤 MichaelRazum
Maybe the developers just don't care that much.

We once created an Electron-based app for a very simple use case. It was extremely slow, partly from some bad dev decisions and partly because nobody focused on performance. We later optimized it a bit and it was usable. In the end it worked; features and good looks were just way more important.


👤 trinovantes
Upgrading to an NVMe drive was probably the best purchase I made last year.

Sadly, even with an NVMe boot drive, it still takes me 11s to boot Windows because I've got multiple Electron apps at startup: Spotify, Docker Desktop, and Slack.


👤 ohiovr
Linux has preload, which makes applications launch faster the second time they are run. My system runs off a thumb drive, and it takes 11 seconds to launch Krita the first time; the second time takes 3 seconds.

👤 yuppie_scum
There was that study about how, after the invention of the vacuum cleaner, the amount of time housewives spent cleaning actually went UP!

👤 massysett
You’re using the wrong software. People used to complain about Emacs taking forever to open and being dreadfully slow. Most of Emacs runs instantly on my low-spec, years-old Thinkpad.

Seriously: if you use simple software, it's fast. My Mac is over five years old and everything I do with it runs fast, and that's with my macOS release being only one version behind the current one.


👤 pid_0
Because of the crazy power hardware has now, developers have no incentive to optimize code.

Modern code SUCKS.


👤 zzo38computer
Probably one reason is that slower programs are being written. Sometimes there are actual improvements in the program, but often it is just slower without any real improvement.

👤 daniel-s
Linux/LXQt boots to a ready desktop in about 10 seconds. Not everything is slow.

👤 christian0cfg
Very soon you will barely use any desktop software, nor will your coding depend on your hardware as much as it does now. Many companies are working on this, and the timing seems much better than a few years back. I am curious to see how some startups in this space will reshape the world.

👤 ilrwbwrkhv
Electron is the culprit

👤 techsin101
Also, input latency is worse.