How to download entire front-end of a website [closed]

Tags: html, web

I want to download all the HTML, CSS, and JS files of an entire website in one click. I tried right-clicking and viewing the source code, but then I have to copy and paste each page and create the folders myself, which is very tedious. Is there any open-source software that does this, or do I have to code it myself?

asked Sep 23 '15 by Shreyans


1 Answer

wget is your friend here, and it works on Windows, macOS, and Linux.

wget -r -np -k http://yourtarget.com/even/path/down/if/you/need/it/

-r (recursive): follow links and download the whole tree
-np (no parent): do not follow links to parent directories, so the download stays within the path you give
-k (convert links): make links in downloaded HTML or CSS point to local files

Other useful options:

-nd (no directories): download all files to the current directory
-e robots=off: ignore robots.txt rules and do not download robots.txt files
-A png,jpg: accept only files with the extensions png or jpg
-m (mirror): shorthand for -r -N -l inf --no-remove-listing
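
For example, combining some of these (a hypothetical invocation; yourtarget.com is a placeholder), mirroring a site for offline browsing while ignoring robots.txt might look like:

wget -m -k -e robots=off http://yourtarget.com/

Note that -k rewrites links only after the download finishes, so an interrupted run may leave links still pointing at the original site.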
answered Sep 20 '22 by Shawn Mehan