I would like to load a web page and save it from the command line (I want behavior similar to Firefox's or Chrome's "Save Page As" for a complete page).
I tried using wget and httrack, and they give me the HTML files correctly. But when the HTML is malformed, the browser corrects it while rendering, so using "Save As" there produces the corrected HTML. This does not happen with wget or httrack.
Is there any tool that would render the page and save it locally, along with all the images, Flash, and everything else?
I couldn't find anything else, so I finally ended up opening the page in Firefox, clicking "Save As", and saving it. I wrote a script using Firefox and xdotool to automate the whole task.
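The original script isn't shown, so here is a minimal sketch of that approach, assuming firefox and xdotool are installed and an X session is available; the URL, sleep durations, and output path are illustrative and will need tuning per page:

```shell
#!/bin/sh
# Sketch: automate Firefox's "Save Page As" with xdotool.
save_page() {
  url="$1"
  out="$2"
  firefox "$url" &       # open the page in Firefox
  sleep 10               # crude wait for the page to finish rendering
  xdotool key ctrl+s     # trigger the "Save Page As" dialog
  sleep 2                # wait for the dialog to appear
  xdotool type "$out"    # type the target filename into the dialog
  xdotool key Return     # confirm; Firefox writes out.html plus an out_files/ dir
}

# Only attempt the automation when the tools and a display are present.
if command -v firefox >/dev/null 2>&1 \
   && command -v xdotool >/dev/null 2>&1 \
   && [ -n "$DISPLAY" ]; then
  save_page "${1:?usage: save.sh URL [OUTFILE]}" "${2:-$HOME/page.html}"
else
  echo "missing firefox/xdotool or no X display" >&2
fi
```

Because xdotool types blindly into whatever window has focus, this is fragile: the sleeps must be long enough for the page and the dialog, and nothing else may steal focus while the script runs.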
Thanks for all the help and views, friends.
When I want to save pages for offline use, I use a Firefox add-on called "ScrapBook". That, of course, does not meet your command-line requirement. But with a tool like HtmlUnit or something similar, you can drive a browser to the page you want to save.