I have to check the status of 200 HTTP URLs and find out which of them are broken links. The links are listed in a simple text file (say URL.txt in my ~ folder). I am using Ubuntu 14.04 and I am a Linux newbie, but I understand the bash shell is very powerful and could help me achieve what I want.
My exact requirement is to read the text file containing the list of URLs, automatically check whether each link is working, and write the results to a new file listing the URLs and their corresponding status (working/broken).
You can simply run echo $? after executing a command, which will print the program's exit code. Every command returns an exit status (sometimes referred to as a return status or exit code).
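For example, here is a minimal sketch of checking a single URL with curl (the URL is only illustrative); the --fail option makes curl return a non-zero exit status for HTTP error responses:

# HEAD request; discard the output and rely only on the exit status
curl --silent --head --fail "https://example.com" > /dev/null
echo $?   # 0 on success, non-zero (e.g. 22 for HTTP errors) otherwise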
bash [filename] runs the commands saved in a file. $@ refers to all of a shell script's command-line arguments. $1, $2, etc., refer to the first command-line argument, the second command-line argument, and so on. Place variables in quotes if their values might contain spaces.
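A small sketch illustrating these points (args-demo.sh is a hypothetical script name):

#!/bin/bash
# args-demo.sh - demonstrates positional parameters and quoting
echo "First argument: $1"
# Quoting "$@" preserves arguments that contain spaces
for arg in "$@"; do
    echo "Argument: $arg"
done

Run it with, for example, bash args-demo.sh URL.txt "some file.txt".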
How about adding some parallelism to the accepted solution? Let's modify the script chkurl.sh
so that it is a little easier to read and handles just one request at a time:
#!/bin/bash
# Fail with a message if no URL was passed as the first argument.
URL=${1?Pass URL as parameter!}
# Issue a HEAD request, discard the headers, and print the URL,
# the HTTP status code, and the redirect target (if any).
curl -o /dev/null --silent --head --write-out "$URL %{http_code} %{redirect_url}\n" "$URL"
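If you save this as chkurl.sh, remember to make it executable before calling it as ./chkurl.sh (the test URL below is only an example):

chmod +x chkurl.sh
./chkurl.sh "https://example.com"   # prints e.g.: https://example.com 200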
Now you can check your whole list with:
cat URL.txt | xargs -P 4 -L1 ./chkurl.sh
This could finish the job up to 4 times faster.
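Since the original requirement is to write the results to a new file, you can simply redirect the output of the whole pipeline (status.txt is just an example filename):

cat URL.txt | xargs -P 4 -L1 ./chkurl.sh > status.txt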