I have a web directory where I store some config files. I'd like to use wget to pull those files down and maintain their current structure. For instance, the remote directory looks like:
http://mysite.com/configs/.vim/
.vim holds multiple files and directories. I want to replicate that on the client using wget. Can't seem to find the right combo of wget flags to get this done. Any ideas?
Typically, if you want to download a directory and all of its subdirectories with wget, you use the -r (recursive) option. You can also add the --no-parent option to keep wget from ascending into parent directories.
To recursively download over FTP instead, simply change https:// to ftp:// in the URL. The key recursive option is --recursive: download recursively, recreating the directory structure locally.
By default, wget will download all files and subfolders under the folder URL. If you want only the files directly inside the target folder and not its subfolders, use the -l1 option. If you want the folder plus one level of subfolders (e.g. www.example.com/products/category), use -l2.
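For example, assuming the configs live at http://example.com/configs/.vim/ (a placeholder URL), the depth options would look like this:
# Mirror the folder and everything beneath it
wget -r --no-parent http://example.com/configs/.vim/
# Only the files directly inside the folder (recursion depth 1)
wget -r -l1 --no-parent http://example.com/configs/.vim/
# The folder plus one level of subfolders (recursion depth 2)
wget -r -l2 --no-parent http://example.com/configs/.vim/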
You have to pass the -np/--no-parent option to wget (in addition to -r/--recursive, of course), otherwise it will follow the link in the directory index on your site back up to the parent directory. So the command would look like this:
wget --recursive --no-parent http://example.com/configs/.vim/
To avoid downloading the auto-generated index.html files, use the -R/--reject option:
wget -r -np -R "index.html*" http://example.com/configs/.vim/
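If you'd also like the files to land directly in a local .vim/ directory rather than under example.com/configs/, wget's -nH (--no-host-directories) and --cut-dirs options can trim the leading path components. A sketch, assuming the same placeholder URL:
# -nH drops the example.com/ directory, --cut-dirs=1 drops the configs/ component
wget -r -np -R "index.html*" -nH --cut-dirs=1 http://example.com/configs/.vim/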