
How do you archive an entire website for offline viewing?

We have actually burned static/archived copies of our asp.net websites for customers many times. Until now we have used WebZip, but we have had endless problems with crashes, downloaded pages not being re-linked correctly, etc.

We basically need an application that crawls and downloads static copies of everything on our asp.net website (pages, images, documents, css, etc.) and then processes the downloaded pages so that they can be browsed locally without an internet connection (getting rid of absolute urls in links, etc.). The more idiot-proof, the better. This seems like a pretty common and (relatively) simple process, but I have tried a few other applications and have been really unimpressed.

Does anyone have archive software they would recommend? Does anyone have a really simple process they would share?

asked Feb 11 '09 by jskunkle

People also ask

Is it possible to save a whole website for offline browsing?

You can save webpages to read later, even when you're offline, such as when you're on an airplane or somewhere else without an Internet connection. To read webpages offline later, download them in Chrome ahead of time.

Can you archive a website?

There are several ways to archive a website. A single webpage can simply be saved to your hard drive, free tools such as HTTrack or online services such as the Wayback Machine can be used, or you can depend on a CMS backup. But the best way to capture a site is to use an automated archiving solution that captures every change.
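As a rough illustration, a minimal HTTrack command-line invocation looks something like the sketch below (https://example.com and the ./example-mirror output directory are placeholders, not anything from the question):

httrack "https://example.com/" -O ./example-mirror "+*.example.com/*" -v

Here -O sets the directory the mirror is written to, the "+*.example.com/*" filter keeps the crawl on the site's own domain, and -v prints progress as it runs.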


1 Answer

You could use wget:

wget -m -k -K -E http://url/of/web/site 
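Those are all standard GNU wget options:

-m (--mirror) turns on recursive downloading with infinite recursion depth and timestamping, so the whole site gets crawled.
-k (--convert-links) rewrites the links in the downloaded pages to point at the local copies, which is what makes offline browsing work.
-K (--backup-converted) saves the original version of each converted page with a .orig suffix before its links are rewritten.
-E (--adjust-extension) appends .html to pages that are served without an html extension (e.g. asp.net pages like default.aspx), so they open correctly in a local browser.

Depending on the site, you may also want to add -p (--page-requisites) to make sure every image and stylesheet a page references gets downloaded as well.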
answered Sep 22 '22 by chuckg