 

How to download all links to .zip files on a given web page using wget/curl?

A page contains links to a set of .zip files, all of which I want to download. I know this can be done with wget or curl. How is it done?

asked Nov 23 '12 by uyetch

People also ask

How do I download a ZIP file using curl?

The basic syntax: to grab a file with curl, run $ curl https://your-domain/file.pdf. To get a file over the ftp or sftp protocol: $ curl ftp://ftp-your-domain-name/file.tar.gz. You can set the output file name while downloading with the -o option, e.g. $ curl -o file.zip https://your-domain/file.zip.
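For the .zip files in this question, a minimal sketch (the URL and file names here are placeholders, not from the original page):

$ curl -O https://example.com/download/file1.zip               # keep the remote file name
$ curl -o renamed.zip https://example.com/download/file1.zip   # choose your own name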

How do I download all files in a directory using curl?

To download multiple files at the same time, use -O followed by the URL of each file that you wish to download, repeating -O URL for every file. If you run curl with no options other than the URL, the content of the URL (whether it's a web page or a binary file, such as an image or a zip file) is printed to the screen.
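For example, a sketch fetching two hypothetical archives in a single invocation (file names are placeholders):

$ curl -O https://example.com/download/file1.zip -O https://example.com/download/file2.zip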


1 Answer

The command is:

wget -r -np -l 1 -A zip http://example.com/download/ 

What the options mean:

-r,  --recursive          specify recursive download
-np, --no-parent          don't ascend to the parent directory
-l,  --level=NUMBER       maximum recursion depth (inf or 0 for infinite)
-A,  --accept=LIST        comma-separated list of accepted extensions
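Since the question also asks about curl, which has no recursive download mode, a rough equivalent is to scrape the page for .zip links and fetch each one. This is only a sketch, assuming the page's hrefs are absolute URLs (relative links would need the base URL prepended first):

# list the hrefs ending in .zip, strip the surrounding attribute text,
# then hand each URL to curl -O
curl -s http://example.com/download/ \
  | grep -oE 'href="[^"]+\.zip"' \
  | sed 's/href="//; s/"$//' \
  | xargs -n 1 curl -O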
answered Sep 27 '22 by creaktive