How to completely download a subdomain of a website in Linux with wget or some other tool?

I want to download all the passages of http://source.yeeyan.org. It has a lot of pages, e.g. http://source.yeeyan.org/?page=22202. How can I use wget or some other tool on Linux to download all of them? Currently I use the following parameters, but it does not work.

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains yeeyan.org --no-parent source.yeeyan.org
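The question does not say exactly how the command fails, but two common reasons a recursive wget crawl stops short are the default recursion depth of 5 (too shallow for tens of thousands of `?page=N` listings that link to one another) and a restrictive robots.txt, which wget honors by default. A sketch of an adjusted invocation, keeping the original options and adding hedged fixes for both:

```shell
# Sketch, not a confirmed fix -- assumptions:
#   --level=inf     lifts wget's default recursion depth of 5
#   -e robots=off   ignores robots.txt, a frequent cause of a crawl
#                   silently stopping
#   --wait=1        throttles requests so the server is less likely
#                   to block the crawler
wget --recursive --level=inf --no-clobber \
     --page-requisites --html-extension --convert-links \
     --restrict-file-names=windows \
     --domains yeeyan.org --no-parent \
     -e robots=off --wait=1 \
     http://source.yeeyan.org/
```

Note that wget warns when `--no-clobber` and `--convert-links` are combined and only applies `--convert-links`; dropping `--no-clobber` avoids the warning.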

asked Oct 08 '22 17:10 by luyi0619

1 Answer

As an alternative to wget there is also httrack, whose sole purpose is to mirror websites and which may therefore suit you better. It also has a GUI.
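A minimal command-line sketch of how httrack could be pointed at the subdomain (assumes httrack is installed, e.g. via `apt-get install httrack`; the output directory name and the scan filter are illustrative, not taken from the question):

```shell
# Mirror the subdomain into ./yeeyan-mirror.
# The "+..." argument is an httrack scan filter that keeps the crawl
# on source.yeeyan.org; -v prints verbose progress.
httrack "http://source.yeeyan.org/" \
        -O ./yeeyan-mirror \
        "+source.yeeyan.org/*" \
        -v
```

An interrupted mirror can later be resumed by re-running httrack in the same output directory with the `--continue` option.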

answered Oct 12 '22 20:10 by scai