There is an online HTTP directory that I have access to. I have tried to download all sub-directories and files via wget. The problem is that when wget downloads a sub-directory, it downloads the index.html file that lists the files in that directory, but not the files themselves.
Is there a way to download the sub-directories and files without a depth limit (as if the directory I want to download were just a folder that I want to copy to my computer)?
"Download Master" is an extension for Google Chrome that works great for downloading from directories. You can choose to filter which file-types to download, or download the entire directory. Show activity on this post. You can use this Firefox addon to download all files in HTTP Directory.
Yes, it is possible, sometimes. When you browse to a web page (say, http://demo.domain.tld/testdir/index.html), the server returns the file you specified (in this case, index.html).
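To see what is actually being served, you can fetch the directory URL directly; an auto-generated directory index is just an HTML page of links (demo.domain.tld, testdir, and the file names below are placeholders from the example above, not a real server):
curl -s http://demo.domain.tld/testdir/
# typical output is an HTML listing along the lines of:
#   <a href="file1.zip">file1.zip</a>
#   <a href="subdir/">subdir/</a>
This listing page is what wget saves as index.html when it does not follow the links recursively.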
Solution:
wget -r -np -nH --cut-dirs=3 -R index.html http://hostname/aaa/bbb/ccc/ddd/
Explanation:
-r : download recursively
-np : do not ascend to parent directories, like ccc/…
-nH : do not create a directory named after the hostname
--cut-dirs=3 : save directly into ddd/ by omitting the first 3 directory levels aaa, bbb, ccc
-R index.html : exclude the index.html listing files
Reference: http://bmwieczorek.wordpress.com/2008/10/01/wget-recursively-download-all-files-from-certain-directory-listed-by-apache/
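As a concrete sketch of the same command (hostname and aaa/bbb/ccc/ddd are placeholders, not a real server), here it is with long options, a reject pattern that also catches the index.html?C=N;O=D sort variants Apache listings generate, and an optional filter to fetch only one file type:
wget --recursive --no-parent --no-host-directories --cut-dirs=3 \
     --reject "index.html*" \
     http://hostname/aaa/bbb/ccc/ddd/
# files end up under ./ddd/ instead of ./hostname/aaa/bbb/ccc/ddd/
# add e.g. --accept "*.pdf" to download only PDFs from the listing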
I was able to get this to work thanks to this post utilizing VisualWGet. It worked great for me. The important part seems to be to check the -recursive flag (see image). Also found that the -no-parent flag is important, otherwise it will try to download everything.
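For reference, those two VisualWGet settings correspond to plain wget options (the URL below is a placeholder); this is just the minimal recursive form, without the --cut-dirs trimming from the earlier answer:
wget --recursive --no-parent http://hostname/some/dir/
# --recursive (-r) follows the links in the index pages
# --no-parent (-np) keeps wget from climbing above /some/dir/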