
How do I configure wget to retry more than 20 times?

I used d4x to continue downloads, but it has been removed from Ubuntu, so now I use FlashGot with wget to continue downloads. However, wget stops after 20 tries, and I have to restart it manually. Is there a config file where I can raise the retry limit above 20?

The wget command line is generated automatically by FlashGot, so please don't tell me to pass options on the wget command line.

jon doe asked Jul 11 '15 03:07


People also ask

How many times does Wget try?

The default is to retry 20 times, with the exception of fatal errors like "connection refused" or "not found" (404), which are not retried.

How do I stop Wget from retrying?

When interacting with the network, Wget can check for timeout and abort the operation if it takes too long. This prevents anomalies like hanging reads and infinite connects. The only timeout enabled by default is a 900-second read timeout. Setting a timeout to 0 disables it altogether.

What is Wget spider?

The wget tool is essentially a spider that scrapes/leeches web pages, but some web hosts may block these spiders with robots.txt files. Also, wget will not follow links on web pages that use the rel=nofollow attribute. You can, however, force wget to ignore robots.txt (e.g. with -e robots=off).

How do I download multiple files using Wget?

If you want to download multiple files at once, use the -i option followed by the path to a local or external file containing a list of the URLs to be downloaded. Each URL needs to be on a separate line. If you specify - as a filename, URLs will be read from the standard input.
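As a sketch of that -i workflow (urls.txt and the example.org URLs are placeholders, not real downloads):

```shell
# Put one URL per line in a list file (these URLs are placeholders)
cat > urls.txt <<'EOF'
http://example.org/file1.iso
http://example.org/file2.iso
EOF

# Fetch every URL in the list; a short timeout and a single try keep the
# sketch quick. '|| true' because the placeholder URLs won't resolve to files.
wget --timeout=5 --tries=1 -i urls.txt || true
```

Passing - instead of a filename makes wget read the URL list from standard input instead.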


2 Answers

Use the --tries option:

wget --tries=42 http://example.org/ 

Specify --tries=0 or --tries=inf for infinite retrying (default is 20 retries).

The default can also be changed via the config file, if that is your thing; open /etc/wgetrc and look for:

# You can lower (or raise) the default number of retries when
# downloading a file (default is 20).
#tries = 20 

Uncomment the tries = 20 line and change the value to whatever you want.
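A minimal sketch of that edit with sed, done on a stand-in file containing the stock comment block (editing the real /etc/wgetrc needs root):

```shell
# Stand-in for /etc/wgetrc (editing the real file requires root).
printf '%s\n' \
  '# You can lower (or raise) the default number of retries when' \
  '# downloading a file (default is 20).' \
  '#tries = 20' > wgetrc.sample

# Uncomment the tries line and raise the limit to 100 retries.
sed -i 's/^#[[:space:]]*tries[[:space:]]*=[[:space:]]*20/tries = 100/' wgetrc.sample

grep '^tries' wgetrc.sample
```

The bracket expression tolerates variations like "#tries=20" or "# tries = 20", so the same substitution should work on the real file.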

The default is to retry 20 times, with the exception of fatal errors like "connection refused" or "not found" (404), which are not retried.

Filip Bulovic answered Oct 07 '22 19:10


If the default retry value cannot meet your needs, you are probably downloading from an unstable source.

The following options may also help a lot:

--retry-connrefused --waitretry=1 --timeout=15 --tries=0 --continue 

--retry-connrefused Forces wget to retry even when the server refuses the connection; otherwise wget stops retrying.

--waitretry=1 If you decide to retry many times, it's better to wait a short period between retries.

--timeout=15 An unstable link often causes the data flow to stall, and the default 900-second read timeout is too long. There are separate --dns-timeout, --connect-timeout, and --read-timeout options; specifying --timeout sets all three at once.

--tries=0 Makes wget retry indefinitely, except for fatal errors such as 404.

--continue Resumes a partially downloaded file instead of starting over.
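If you want these to apply to every wget run (including command lines FlashGot generates), the equivalent settings can go in ~/.wgetrc; a sketch, using the key names from the wget manual:

```
retry_connrefused = on
waitretry = 1
timeout = 15
tries = 0
continue = on
```

This fits the question's constraint: the config file changes the defaults without touching the generated command line.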

Sulisu answered Oct 07 '22 20:10