They found that for 1.1% of visitors the javascript didn't run.
0.9% was because of errors: network failures, browser extensions breaking the code, a mobile user going through a tunnel.
0.2% was people who actively blocked javascript.
I'm writing a small article about making sure websites work for that 0.2%.
1. https://gds.blog.gov.uk/2013/10/21/how-many-people-are-missing-out-on-javascript-enhancement/
The friction of surfing this way doesn't get lower over time, not least because my exposure to most non-whitelisted websites is through sites like HN, but it's also not high enough that I would consider not blocking JavaScript by default.
Websites that use a lot of third-party JavaScript are red flags to me; more often than not, those scripts are unrelated to critical functionality (such as payment) and add absolutely no value to the client.
Perhaps one thing worth noting is that modern frameworks that are SSG-focused, or hybrids like Next.js, make developing websites for JavaScript-blocking clients a bit easier. I haven't personally looked into this too closely, but at least with the website projects I have worked on with Next.js, you still get a presentable bare minimum with little effort when JavaScript is turned off, and it's not difficult to implement a reasonable fallback for JavaScript-blocking clients.
The above of course depends on how much server data needs to be fetched dynamically to render a page, which in turn depends on the nature of the content on a website. If I'm not mistaken, there are functionalities that simply can't be implemented, or are very impractical to implement, without JavaScript; in those cases the question of "making it work for that 0.2%" is probably moot.
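The pre-rendering idea above can be sketched framework-agnostically. This is a minimal, hypothetical illustration (all names are my own, not from any real project): whether the HTML comes from Next.js's build step or any other server-side render, the point is that the client receives complete markup, and only the enhancements require JavaScript.

```javascript
// Minimal sketch of the pre-rendering fallback idea. In Next.js this
// HTML would be produced at build time (e.g. via getStaticProps); here
// it is just a plain function to show the principle. All names are
// illustrative assumptions.
function renderArticle(article) {
  // Produce complete HTML before it reaches the browser, so a client
  // with JavaScript disabled still gets readable content.
  return [
    "<main>",
    `  <h1>${article.title}</h1>`,
    `  <p>${article.body}</p>`,
    // Dynamic extras (comments, live widgets) only load if JS runs;
    // a <noscript> note keeps the page honest without it.
    "  <noscript><p>Comments require JavaScript.</p></noscript>",
    "</main>",
  ].join("\n");
}
```

The bare minimum the commenter describes is exactly this: the article text is in the initial HTML, and only the enhancement layer is lost for the JavaScript-blocking client.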
However, it is sad how few pages work without javascript these days. Nevertheless, I very rarely enable it for any site. Instead, if there is a page I really want to use, I "fix" it using a userscript. Sometimes it's as simple as modifying the CSS to reveal the hidden content. Sometimes it involves parsing some inline json from the document, doing XHR to get the content and building the html from scratch to show it. I suspect I'm in the minority even amongst the 0.2%.
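The "parse inline JSON, rebuild the HTML" repair described above can be sketched roughly like this. This is a hypothetical userscript fragment, not any real site's fix; the selector, JSON shape, and function names are all assumptions, and real pages would need per-site adjustments (and proper escaping of untrusted strings).

```javascript
// Hypothetical userscript sketch: recover article content that a page
// only renders client-side, by reading the data it ships inline.

// Pull the first inline <script type="application/json"> blob out of
// raw HTML. (The type attribute and payload shape are assumptions.)
function extractInlineJson(html) {
  const m = html.match(
    /<script[^>]*type="application\/json"[^>]*>([\s\S]*?)<\/script>/
  );
  return m ? JSON.parse(m[1]) : null;
}

// Build minimal replacement HTML from the recovered data.
// NB: a real script should escape these strings before injecting them.
function buildArticleHtml(data) {
  const paras = data.paragraphs.map((p) => `<p>${p}</p>`).join("\n");
  return `<article><h1>${data.title}</h1>\n${paras}</article>`;
}

// In the actual userscript these would run against the live page:
//   const data = extractInlineJson(document.documentElement.outerHTML);
//   if (data) document.body.innerHTML = buildArticleHtml(data);
```

The XHR variant the commenter mentions works the same way, except the JSON is fetched from the site's API endpoint instead of scraped from the document.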
* The speed gain is noticeable. I love it.
* The security benefit is qualitative/theoretical. I do not have any quantifiable data here.
* The change in UX is usually acceptable. I consume mostly textual information.
A few websites' content and/or services are valuable enough that I whitelist them. For example, GitHub, YouTube, and HN.
I am not saying it is not worth the effort, but I am saying you should clearly set expectations and measure to be sure that the juice is worth the squeeze. If your motivation is that it is the "right thing" to do, and not that you can quantify that the effort is worth it, then it almost certainly is not. On the other hand, your business analysis might show that this type of client is 10x more likely to subscribe to or buy your product, and supporting them is a no-brainer.
Note that I am assuming we are talking about a smaller client base and dev team. Social-media-sized sites might, for example, consider this 0.2% of their prospective user base a critical mass and devote thousands of hours to supporting them. You might also feel that in your case the use of JavaScript hinders assistive technologies, and that supporting them is critical to your success.
For what it is worth, I do not disable JavaScript myself, nor do I build sites that specifically work without it, though I do try to build clean HTML to the extent it is practical, and I would rather solve a problem with CSS than with JavaScript.
I don’t think users who bounce because of huge JS bundles are tracked very well, because often the bundle that hasn’t loaded is what does the tracking.
Most of the browsing I do in this profile is reading articles or visiting sites that are mostly text and images. The speed gains are enormous, as is the improvement in performance on my lower-powered laptop. I just don't see the need to enable JS unless the site breaks. If the site breaks, either it is poorly written, in which case I have the option to bounce, or it uses a JS framework like React or whatever, in which case I allow at most first-party JS. If it works, it works.
I also disable fonts; I like to specify my own fonts instead. I also disable favicons, since it is not a feature I use, and third-party cookies, and some other features.
Sometimes I disable CSS too, and sometimes this compensates for disabling JavaScript. Sometimes I also use user scripts and user CSS.
But here is an unexpected side effect I noticed: sites that require JS just to display their content correlate highly with low-quality content, which I usually regret having wasted my time reading.
This, above all the other reasons, is why I don't bother. Sure, I can unblock it with a quick three-key sequence, or I can use alternative access methods, but usually I just thank them for saving me the time and move on.
Also, some portion of that 0.9% might be running LibreJS or NoScript, which have the user review each script file before running it and are much more aggressive about blocking cross-site requests. With "disable JS" removed from the Firefox GUI, a lot of the people "blocking JavaScript" probably show up as "errors."
I use this. The main reason is performance: my ThinkPad is unusable if I let everyone's crap run wild on it. Also, evaling every snippet of code you get over the network is absolute brain damage from a security perspective, but no one actually cares anymore.
Abusive scripts, dark patterns, and subversion of user autonomy are major issues.
My hunch is that the 0.2% is largely machines in such a locked-down security stance that you have little hope of getting any of them to allow JS.
I block JS because the web is so much faster without it, especially text articles.
Site by site, I block javascript when it is in my way. Wikipedia for example. There is no alignment between my reasons for going to Wikipedia and the various reasons Wikipedia uses Javascript. Lazy pseudo-paywalls are another use case. Obnoxious "this site uses cookies" also.