
How to `wget` a list of URLs in a text file?

Tags:

text

wget

Let's say I have a text file of hundreds of URLs in one location, e.g.

http://url/file_to_download1.gz
http://url/file_to_download2.gz
http://url/file_to_download3.gz
http://url/file_to_download4.gz
http://url/file_to_download5.gz
....

What is the correct way to download each of these files with wget? I suspect there's a command like `wget -flag -flag text_file.txt`

ShanZhengYang asked Dec 06 '16


People also ask

How do I download a list of files from wget?

In order to download a file using Wget, type wget followed by the URL of the file that you wish to download. Wget will download the file from the given URL and save it in the current directory.
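As a minimal sketch of that basic form (the URL below is a placeholder from the question, not a real file):

```shell
# Basic form: wget <URL> fetches the file into the current directory.
url=http://url/file_to_download1.gz

# Uncomment to actually download:
# wget "$url"

# The saved filename defaults to the last path component of the URL:
basename "$url"
```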

What is wget spider?

The wget tool is essentially a spider that scrapes / leeches web pages, but some web hosts may block these spiders with robots.txt files. Also, wget will not follow links on web pages that use the rel=nofollow attribute. You can, however, force wget to ignore robots.txt.

How do I use wget recursively?

To use Wget to download recursively over FTP, simply change https:// to ftp:// in the URL of the FTP directory. Wget's main recursive option is --recursive, which downloads recursively and recreates the remote folder structure on your PC.


1 Answer

A quick `man wget` gives me the following:

[..]

-i file

--input-file=file

Read URLs from a local or external file. If - is specified as file, URLs are read from the standard input. (Use ./- to read from a file literally named -.)

If this function is used, no URLs need be present on the command line. If there are URLs both on the command line and in an input file, those on the command lines will be the first ones to be retrieved. If --force-html is not specified, then file should consist of a series of URLs, one per line.

[..]

So: `wget -i text_file.txt`
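Putting it together as a minimal end-to-end sketch (urls.txt and the URLs are placeholder names):

```shell
# Write the URLs to a file, one per line.
cat > urls.txt <<'EOF'
http://url/file_to_download1.gz
http://url/file_to_download2.gz
http://url/file_to_download3.gz
EOF

# Fetch every URL listed in the file (uncomment to actually download):
# wget -i urls.txt
# Handy companions:
#   wget -c -i urls.txt    # -c resumes interrupted downloads
#   wget -nc -i urls.txt   # -nc skips files that already exist locally
wc -l < urls.txt
```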

dim-0 answered Sep 24 '22