HACKER Q&A
📣 behnamoh

Why hasn't saving webpages to the hard disk gotten better?


I use Firefox, and every time I try to save a webpage, the first try fails! The second one works, but then the saved file doesn't look like the page you'd see online. There are workarounds, such as the *.mhtml format, but I wonder why browsers haven't gotten better at saving webpages locally.

I've used Chrome too; same problem.


  👤 namaljayathunga Accepted Answer ✓
I use this to back up pages automatically:

https://github.com/i5ik/22120


👤 asaddhamani
Use SingleFile. It inlines all the CSS and images, and your saved page will look pretty close to the actual page.
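
For the curious, the trick behind that is resource inlining: rewrite every external reference so the HTML carries its own assets. A rough sketch of the idea in Python (an illustration only, not SingleFile's actual code; the URL is a placeholder):

    # Sketch of the inlining idea: fetch a page, then rewrite each
    # src="..." attribute (images, scripts) into a base64 data: URI
    # so the saved HTML is self-contained.
    import base64
    import re
    import urllib.request
    from urllib.parse import urljoin

    PAGE = "https://example.com/"  # placeholder URL

    html = urllib.request.urlopen(PAGE).read().decode("utf-8", "replace")

    def inline_src(match):
        absolute = urljoin(PAGE, match.group(1))
        try:
            with urllib.request.urlopen(absolute) as resp:
                mime = resp.headers.get_content_type()
                data = base64.b64encode(resp.read()).decode("ascii")
            return f'src="data:{mime};base64,{data}"'
        except OSError:
            return match.group(0)  # leave the reference alone on failure

    html = re.sub(r'src="([^"]+)"', inline_src, html)

    with open("saved.html", "w", encoding="utf-8") as f:
        f.write(html)

(A real tool also has to handle CSS url(...) references, srcset, fonts, and scripts, which is why an extension beats a one-off script.)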

👤 account-5
I tend to use the SingleFile extension.

I am looking for something that saves just the JSON responses from a website. I can see the JSON in the DevTools Network tab but can't extract it. I'm not a webdev, so it may be easy; I just don't know how it's done.
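
One way to get at it: in the Network tab, right-click the request and choose "Copy as cURL" to replay it outside the browser, or re-issue the request from a short script. A minimal sketch (the URL and header are placeholders; copy the real ones from the request you saw):

    # Re-request a JSON endpoint spotted in the DevTools Network tab
    # and write the response to disk.
    import json
    import urllib.request

    URL = "https://example.com/api/items"  # placeholder endpoint

    req = urllib.request.Request(URL, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)

    with open("response.json", "w", encoding="utf-8") as f:
        json.dump(data, f, indent=2)

Note that endpoints behind a login usually need the cookies or auth headers from the original request as well, which "Copy as cURL" captures for you.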


👤 S4M
I use wget (on the command line) to save a webpage. It also has options to recursively save all the links on the page, or only the links from a certain domain.
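
For reference, all of that is standard wget flags (the URLs below are placeholders):

    # Single page plus the images/CSS it needs, with links rewritten
    # so it renders offline:
    #   -p = --page-requisites, -k = --convert-links,
    #   -E = --adjust-extension, -H = --span-hosts (off-site assets)
    wget -E -H -k -p https://example.com/article.html

    # Recursive save, one level deep, restricted to one domain:
    wget --recursive --level=1 --domains=example.com https://example.com/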

👤 ggm
The odd thing is that most of the fetched state is already there, either in shmem/mmap or in the on-disk cache. A simple "tee" of the data coming out of QUIC could replay the load of the disparate markup elements.

It's all there. The problem is deciding how to add code that saves it in a rational, reloadable manner.

Compare it to, e.g., a ZFS snapshot: low-cost copy-on-write, saved state. (Admittedly of data on disk, but that's the goal here anyway, and the browser cache is already on disk.)


👤 spsphulse
I'm a happy user of a Chrome extension called Save Page WE.

👤 Nikhiil_Rajesh
Try using the [SingleFile](https://chrome.google.com/webstore/detail/singlefile/mpiodij...) browser extension. This is what I typically use, and it works very well.