 

Scrape and convert website into HTML? [closed]

I haven't done this in 3 or 4 years, but a client wants to downgrade their dynamic website into static HTML.

Are there any free tools out there to crawl a domain and generate working HTML files to make this quick and painless?

Edit: it is a ColdFusion website, if that matters.

Kevin asked Aug 12 '10 15:08

2 Answers

Getleft is a nice Windows client that can do this. It is very configurable and reliable.

Wget can, too, with the --mirror option.
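A minimal wget invocation along these lines, assuming wget is installed and using https://example.com/ as a stand-in for the client's actual domain, might look like:

```shell
# Crawl the site and save a browsable static copy under ./example.com/
# --mirror           recursive download with timestamping, infinite depth
# --convert-links    rewrite links in saved pages to point at the local copies
# --adjust-extension save pages as .html even when the URL has no extension
#                    (useful for a ColdFusion site serving .cfm pages)
# --page-requisites  also fetch the CSS, images, and scripts each page needs
wget --mirror --convert-links --adjust-extension --page-requisites \
     --no-parent https://example.com/
```

The --convert-links and --adjust-extension flags do the actual "conversion": dynamic pages are saved as .html files and the internal links are rewritten so the copy works when opened locally.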

Pekka answered Nov 17 '22 02:11


Try using httrack (or webhttrack/winhttrack, if you want a GUI) to spider the web site. It's free, fast, and reliable. It's also much more powerful than primitive downloaders like wget; httrack is designed for mirroring web sites.
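A basic httrack run, again assuming the tool is installed and with https://example.com/ standing in for the real site, could look like:

```shell
# Mirror the site into ./static-copy, staying within example.com
# "+*.example.com/*" is a scan-rule filter that keeps the crawl on the
# target domain; -O sets the output directory, -v prints progress
httrack "https://example.com/" -O ./static-copy "+*.example.com/*" -v
```

webhttrack and winhttrack drive the same engine through a GUI, so the equivalent options can be set interactively there instead.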

Be aware that converting a dynamic site to static pages will cost you a lot of functionality. It's also not always possible: a dynamic site can generate an effectively infinite number of distinct pages.

Borealid answered Nov 17 '22 02:11