We can download multiple links using wget -i file_name, where file_name is a file containing all the URLs we want to download.
For example, I have 3 URLs in a file:
google.com
facebook.com
twitter.com
I request these URLs using wget -i file_name. But how can we specify the file names used to store each result?
For example, we want to store the results from google.com, facebook.com, and twitter.com as response1, response2, and response3 respectively. Thanks in advance.
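To be concrete, if the list above is saved as urls.txt (a name chosen just for this example), the command is:
wget -i urls.txt
This saves each page under its default remote name (index.html, index.html.1, and so on) rather than a name of my choosing.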
I found a similar question here.
Use the -O file option.
E.g.
wget google.com
...
16:07:52 (538.47 MB/s) - `index.html' saved [10728]
vs.
wget -O foo.html google.com
...
16:08:00 (1.57 MB/s) - `foo.html' saved [10728]
Referring to the above, I came up with a solution: write a simple shell script. (Combining -O with -i doesn't help here; when used together, wget concatenates all the downloads into the single named file, so each URL needs its own wget call.)
It's simply a matter of executing
wget -O <filename> <URL>
once per URL.
Create a file download_file.sh with contents like this
#!/bin/bash
wget https://www.google.com -O google_file
wget https://www.facebook.com -O fb_file
wget https://www.twitter.com -O twitter_file
Make the file executable
chmod +x download_file.sh
Run the file
./download_file.sh
All the URLs will be downloaded and saved with the filenames defined in download_file.sh. You can also tweak the shell script to suit your requirements, for example by reading the URLs from another file passed as an argument, as sketched below.
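Here is a minimal sketch of that variant (the script name download_from_list.sh and the response1, response2, ... naming are assumptions taken from the question, not anything wget prescribes):
#!/bin/bash
# Usage: ./download_from_list.sh urls.txt
# Reads one URL per line from the file given as the first argument
# and saves each response as response1, response2, ... in order.
i=1
while IFS= read -r url; do
    [ -z "$url" ] && continue   # skip blank lines
    wget -O "response$i" "$url"
    i=$((i + 1))
done < "$1"
Run it the same way as before: chmod +x download_from_list.sh, then ./download_from_list.sh urls.txt.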