1. There are a lot of security mitigations in modern Windows that add sometimes substantial runtime overhead, but offer significant protection. Writing an exploit for modern Windows is hard compared to XP SP1 and below.
2. The graphics are a lot more sophisticated. Animation and transparency and lots of good stuff.
3. There are a ton more "features" in the OS. Modern Windows is continually running a bunch of things that XP didn't.
4. All software tends to bloat over time. It's really hard and expensive to optimize things, especially in a huge and long-lived OS where the scope is enormous and PMs are constantly wanting things added. When it runs sufficiently on modern hardware, there's also very little incentive to do so. (As an aside, this is one of the reasons I love Linux. There's lots of incentive to optimize there and thus it does happen).
Overall I really miss XP. Surely nostalgia is a powerful drug, but damn that was a great OS.
Be careful with disabling services. On a different machine, I disabled the Windows Store because I thought I would never use it and then later spent hours debugging why an app I downloaded from the web failed to install with a mysterious error. One thing to try is to turn off/disable/uninstall a service or program each day and log what you're doing in case you need to reverse something that breaks the next day.
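That log-as-you-go approach can be as simple as an append-only journal that records each change together with the exact step that undoes it. A minimal sketch (a hypothetical helper script, not a Windows tool; the entries are illustrative):

```python
# Minimal change journal for the "disable one thing per day" approach:
# record each tweak with a date and the command that reverses it, so when
# something breaks days later you know which change to roll back first.
import datetime
import json
import pathlib

JOURNAL = pathlib.Path("tweak-journal.jsonl")

def log_tweak(what: str, undo: str) -> None:
    """Append one change record as a line of JSON."""
    entry = {
        "date": datetime.date.today().isoformat(),
        "what": what,
        "undo": undo,
    }
    with JOURNAL.open("a") as f:
        f.write(json.dumps(entry) + "\n")

def recent_tweaks(n: int = 5) -> list[dict]:
    """The most recent changes -- the first suspects when something breaks."""
    if not JOURNAL.exists():
        return []
    lines = JOURNAL.read_text().splitlines()
    return [json.loads(line) for line in lines[-n:]]
```

The point of storing the undo step alongside the change is that a mysterious failure a week later becomes a quick scan of the last few journal entries instead of a debugging session.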
And the HN thread: https://news.ycombinator.com/item?id=36503983
And that's how we end up with software that's just fast enough to keep users from seeking alternatives.
One thing that comes to mind: Windows XP was the last version without desktop composition. I don't have hard numbers, but I would guess composition adds more latency and requires more GPU memory than the Windows XP rendering system did.
1. People expect more functionality, and that requires more processing power
2. Fancier eye candy
3. Spyware, or telemetry, or whatever else you want to call it
4. More background services are running by default
1) Applications that aren't native, especially Electron apps or similar. These are terribly slow to start since they pull up an entire browser environment just to display their interface, and they tend to start slowly and often run poorly. There are a lot of them about now.
2) CPU power profiles where the CPU sits at idle and then has to zoom up to boost speed. That ramp can take quite a while for some reason, and it causes a lot of Windows effects to miss a frame and get janky. This really depends on your processor, power profiles, and core-parking settings, but it's bugged me for years that processor power saving is clearly not free, and one place you notice it is when going from idle to launching apps or doing something UI-wise that involves a lot of movement.
But generally I don't think Windows is much slower. It uses more RAM and drive space, and it really can't be run on anything but an SSD now, but power-saving consequences combined with inefficient apps are mostly where I notice it's not smooth.
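The idle-to-boost ramp described above can be observed with a crude microbenchmark: let the machine idle briefly, then time the same short busy loop several times back to back. On aggressive power plans the first sample is often the slowest while the governor ramps the clock. This is a rough sketch; the effect depends heavily on hardware, power plan, and core-parking settings, and may not reproduce everywhere:

```python
import time

def busy_loop_ms(iters: int = 200_000) -> float:
    """Time a fixed amount of integer work, in milliseconds."""
    start = time.perf_counter()
    acc = 0
    for i in range(iters):
        acc += i * i
    return (time.perf_counter() - start) * 1000.0

def ramp_samples(n: int = 5, idle_s: float = 0.5) -> list[float]:
    """Idle briefly (so the governor can downclock), then take n
    back-to-back samples of the same workload. If the CPU was parked
    or downclocked, the first sample tends to be the slowest."""
    time.sleep(idle_s)
    return [busy_loop_ms() for _ in range(n)]
```

Running `ramp_samples()` after the machine has been idle and comparing the first sample against the later ones gives a feel for how much the frequency ramp costs on a given power profile.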
XP ran pretty good on a PIII 500MHz / 512MB if you kept an eye on startup bloat via msconfig.exe / autoruns.exe
XP still had major security flaws until SP3 came out. The solution was to keep the flaws and code around them without touching the foundation.
It's gotten worse because more crap on the PC to capture you as a consumer. That's why they don't call it "My Computer" anymore.
Software stacks got deeper and more resource intensive, and the OS has more features.
The SSD and GPU were fine, so I quickly changed the motherboard to a several-years-old low-end dual-core Celeron I had held on to when upgrading someone's PC. It didn't have more than 2GB of memory either.
Windows 8 had just been released, so I tried installing that as I had read they had spent some effort trying to make it run better on lower-end hardware.
Surprisingly, with the SSD, I could hardly notice the difference to my old box in regular use. Sure it was a tad more sluggish at certain things, but overall it ran like a champ. With the beefy GPU I could even play a lot of games at decent framerates.
I recall being quite impressed at the time.
They focus on a lot of extra tools, like the former Classic Shell, which has been forked to make it more maintainable, or the former Windows internals tools and their integrations. They also backport a lot of drivers (and there are HUGE bundles of drivers available) to make it run on modern Ryzen hardware.
Ironically, a lot of hackers and ransomware operators use Win7 because it has a much smaller attack surface than the crap that is now Windows 11. Well, that is if they're not using Linux these days anyways.
Fresh install makes sure there is no vendor spyware, like Lenovo superfish.
Then I disable internet stuff: cloud, Cortana, Start menu search should not search the internet for answers, etc., etc.
After that the computer should be much faster.
I presume the Recall feature will also consume high volumes of energy and PC time.
I'm sure it's performing much better than XP as far as M$ is concerned...
One shouldn't look to M$ for a performant computer in general. The black-box nature makes it largely untunable, and its top priority will always be to provide maximum benefit to M$, not to the user...
You can't compare an OS from 22 years ago to an OS from today.
Even if there was a simple definition of "worse", in this context, which there isn't, they're from completely different worlds designed for different uses on different hardware.
E.g.:
Visual Studio Rant https://www.youtube.com/watch?v=GC-0tCy4P1U
How fast should an unoptimized terminal run? https://www.youtube.com/watch?v=hxM8QmyZXtg
The root cause is that Microsoft simply doesn't have performance as a priority. It's not a KPI for (almost) anyone, certainly nobody in the desktop teams. Upper management doesn't prioritize input latency over telemetry or advertising "features". It's that simple. There's no pressure, no rejection of slow code, no metrics, no pride in good engineering. If it barely functions, it's good enough to ship.[1]
Microsoft employees will occasionally comment to say otherwise, but their statements are contrary to easily verifiable facts, such as massive performance degradations in trivial tools such as the new Notepad, the new Terminal, and the Calculator app.
The actual specific technical reasons include:
- Executables used to be loaded from disk one 4KB page at a time. Execution could start as soon as the first 4KB block was loaded. For a long time now, executables are digitally signed using a hash like SHA1 or SHA256, which means that large executables can't actually start running any code until the entire file is loaded. The obvious fix -- that didn't occur to anyone at Microsoft -- is to use a Merkle tree hash instead of a linear, single-threaded, whole-file hash like SHA256.
- Similarly, anti-malware like Defender will pause execution until the entire file is checked. This is often even slower than the hash check. The caching of this is poor to non-existent in practice.
- The Desktop Window Manager (DWM) performs several layers of buffering and swapping before an update gets to the screen. This has improved in very recent builds of Windows 11 but still has some overhead. An effective end-user workaround is to use a high refresh rate monitor such as the new 240 Hz OLEDs. There's a good online rant about it that can be summarised as "only Apple cares about this": https://danluu.com/input-lag/
- Heavyweight dependencies. Modern GUI toolkits like WinUI drag in half of all source code ever written by the human race if you look at them wrong. The Calculator app was taking 10s of seconds to start for some people (ultrabooks on battery) because it was loading crazy stuff like the "Windows 10 Hello for Business account recovery helper"! Why!? Because nothing is shared any more, every process loads its own dependencies... such as: HTTP client, which requires HTTP proxy support, which requires HTTP proxy auth, which requires Windows Auth, which includes Windows 10 Hello for Business passwordless auth, which requires... account recovery GUIs. For a calculator app. And the terminal. PowerShell. CLI tools in general. Etc... Yes, it's nuts. (Keep in mind each dependency must be loaded in full, hashed, signature checked, and scanned for malware!)
- Telemetry. Every team says they do it efficiently, but only after customer complaints pile up. Some teams never learn, because it's fast for them, either because they know the secret code to turn it off, or because they're 1ms away from the telemetry collection endpoint. Intel for example (a hardware company!) has multiple telemetry tools force-installed with their drivers that spend > 10 minutes(!!) of CPU time daily on my laptop. I have to go and use NTFS ACLs to block execute permissions on those things because it "repairs" itself more aggressively than Bonzi Buddy. NVIDIA injects their telemetry into third-party processes, so about 20% of the startup time of the built-in Calculator App is NVIDIA telemetry. It's not a game! It just happens to use DirectX for graphing... so NVIDIA has to know about it. Etc... Microsoft themselves are the #1 worst offender. At one point, Office + Windows had 220 separate telemetry services or endpoints that I had to track down and disable (for a secure VDI project).
PS: You know it's bad when a vendor starts releasing workarounds as a feature for the mis-features of their own product. The new Windows 11 "Dev Drive"[2] exists only as a workaround to bypass the massive overheads of Windows Defender. You would naively think that simply excluding a folder from Defender would have the same effect. Hah! No. Even the IntelliJ IDEA devs got fooled by this: their IDE has a feature[3] to exclude project folders from real-time A/V scanning. This does nothing in Windows 11! All files are always scanned by Defender, even if a third-party anti-virus is installed. This behaviour is the same even on Windows Server, which is one of the main reasons that Azure DevOps build pipelines are something like 4x slower on Windows than on Linux. [4]
[1] To be fair, most software companies are exactly the same. The industry as a whole has an allergy to performance optimisation, and regularly ships billion-dollar products with performance issues in them that would blow your mind. Think of how slow Jira is, or this doozy: https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...
[2] https://learn.microsoft.com/en-us/windows/dev-drive/
[3] https://intellij-support.jetbrains.com/hc/en-us/articles/360...
[4] https://github.com/microsoft/azure-pipelines-agent/issues/44...
Every time I get a new laptop I remove a considerable number of services and programs; when possible I even change the window manager, and I change a lot of default configurations that are useless and/or waste resources: CPU, battery, disk, network.
It's also useful for security, since it reduces the attack surface.
Taken individually, the programs I remove seem harmless, but in reality they are like zombies: in large numbers they can eat you alive.
For example: voice control (just to illustrate the concept). I never use it, especially on my professional machine. Can you imagine people in an open-plan office shouting orders at their computers? A disabled person may well depend on that feature, so it's good to have the option to enable it, but do you think it's acceptable that it's installed by default on your phone / computer / whatever? "Yes, but it uses only 0.6% of CPU in standby, you're exaggerating!" Sure, it seems harmless ALONE, but of the hundreds of processes now running on your PC, how many are doing tasks that are useful to you? And how many are there just because they're installed by default, in case you happen to belong to the small group of users of that application?
I observed systemd-oomd on my machine using a fixed 0.5% of CPU. I have 64GB of RAM and 4TB of disk; I explicitly oversized that computer for my jobs. Why should I need this daemon? By the way, do you know how it works on AIX? You can reserve some RAM for an emergency shell, so in case of OOM you can take action without the machine stalling. Again: "You're exaggerating, it's only 0.5% of CPU!!!" Yes, alone. But right now I have 560 processes running on my PC: if 100 are bloatware, 50% of one core is gone, and that's being optimistic about the related network, RAM, disk, and battery use, etc. Again, if someone finds it useful, good for them; it's good to have the option to install it.
I would like a truly minimal installation mode, where I can select explicitly and in detail what I want to install, plus, for everybody else, a "premium bloat-inflated mode" that installs utilities like scratch-my-back-d and every "just in case" whatever one could desire.
The number of these "inutilities" keeps increasing because someone thinks a MODERN USER requires them, but in reality, if it were still updated, I could do my job with that very OS of 20 years ago, at the speed of light.
Every time I use a VM with legacy software, it's so fast I can't believe it.
We have a Start menu that searches the internet for software to suggest for installation, advertisements everywhere, telemetry, statistics, convoluted GUIs with animations, effects, and bullshit that adds no useful features. They're simply there "just in case".
And the programs? Even worse: layer upon layer of uselessness, coupled with MODERN programming tools, paradigms, and languages that in practice kill performance.
The cloud? It's shoved in every possible useless / annoying way into functionality that in no way benefits from it.
Web apps? I'm sure browsing was faster 10 years ago! Same reasons.
The sad part is that in the end, with commercial OSs like Windows / macOS, you are paying for software, hardware, and electricity to execute tasks you don't want and don't need, and, with few exceptions, the same jobs you were doing 20 years ago with XP are now slower. Linux? Major distributions are becoming indistinguishable from commercial OSs; at least you don't pay for the software.
I'm hearing more and more people say: "I want to migrate to OpenBSD." Don't you?
"Eh, but now you have wifi, ultra-mega-xd VGA, etc, etc".
Of course, and I also have a 20-core processor, 64GB of RAM, a monster GPU with 8GB of RAM, and 2 high-end NVMe drives, and I still had a half-second lag after every key press in my terminal, before I removed tons of bloatware from my laptop. Full story here:
https://news.ycombinator.com/item?id=40110342
So in the end the state of modern OSs pisses me off twice over. Try searching Google for "Ubuntu 22 slow on modern hardware" or something like that.