I want to download all the passages of http://source.yeeyan.org. The site has a lot of pages, e.g. http://source.yeeyan.org/?page=22202. How can I use wget or some other Linux tool to download them all? Currently I use the following parameters, but it does not work:
wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains yeeyan.org --no-parent source.yeeyan.org
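Since the listing URLs follow a predictable ?page=N pattern, one option that sidesteps recursive crawling is to request each page number directly. Below is a minimal sketch of that idea; the output file names are made up for illustration, and the upper bound 22202 is only the number from the example URL above, so replace it with the real last page:

#!/bin/sh
# Fetch each listing page by number instead of relying on wget recursion.
# Assumes pages follow the ?page=N pattern from the question.
for n in $(seq 1 22202); do
    out="page-$n.html"
    # Skip pages already downloaded on a previous run.
    [ -f "$out" ] && continue
    wget -q -O "$out" "http://source.yeeyan.org/?page=$n" || echo "failed: page $n" >&2
done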
As an alternative to wget there is also httrack, whose sole purpose is copying websites, so it may suit you better. It also has a GUI.
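A basic httrack invocation might look like the sketch below; the output directory name and the URL filter are assumptions for illustration, so check httrack's man page for the exact options on your system:

# Mirror the site into ./yeeyan-mirror, restricting the crawl to yeeyan.org.
httrack "http://source.yeeyan.org/" -O "./yeeyan-mirror" "+*.yeeyan.org/*" -v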