I'm trying to extract a line from wget's output but am having trouble with it. This is my wget call:
$ wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html
Output:
--18:24:12-- http://xxx.xxxx.xxxx:15000/myhtml.html
           => `-'
Resolving xxx.xxxx.xxxx... xxx.xxxx.xxxx
Connecting to xxx.xxxx.xxxx|xxx.xxxx.xxxx|:15000... connected.
HTTP request sent, awaiting response...
  HTTP/1.1 302 Found
  Date: Tue, 18 Nov 2008 23:24:12 GMT
  Server: IBM_HTTP_Server
  Expires: Thu, 01 Dec 1994 16:00:00 GMT
  Location: https://xxx.xxxx.xxxx/siteminderagent/...
  Content-Length: 508
  Keep-Alive: timeout=10, max=100
  Connection: Keep-Alive
  Content-Type: text/html; charset=iso-8859-1
Location: https://xxx.xxxx.xxxx//siteminderagent/...
--18:24:13-- https://xxx.xxxx.xxxx/siteminderagent/...
           => `-'
Resolving xxx.xxxx.xxxx... failed: Name or service not known.
If I do this:
$ wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html | egrep -i "302"
it doesn't return the line that contains the string. I just want to check whether the site or SiteMinder is up.
The wget output you are looking for is written to stderr. You must redirect it:
$ wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html 2>&1 | egrep -i "302"
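Since grep exits with status 0 only when it finds a match, you can build the up/down check directly on that exit status. A minimal sketch, using the question's placeholder URL; the messages are just illustrative:

if wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html 2>&1 | grep -qi "302"; then
    echo "302 found - site/SiteMinder is up"    # -q suppresses output, exit status signals the match
else
    echo "no 302 - site may be down"
fi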
wget prints the headers to stderr, not to stdout. You can redirect stderr to stdout as follows:
wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html 2>&1 | egrep -i "302"
The "2>&1" part says to redirect ('>') file descriptor 2 (stderr) to file descriptor 1 (stdout).
A slightly enhanced version of the solution already provided:
wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html 2>&1 >/dev/null | grep -c 302
2>&1 >/dev/null trims off the unneeded output. This way grep parses only wget's stderr, which eliminates the possibility of matching "302" in stdout (where the HTML file itself is printed, along with the progress bar, resulting byte counts, etc.).

grep -c counts the number of matching lines instead of printing them. Knowing how many lines matched is enough here.
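If you prefer a script around that count, here is a short sketch; the variable name and messages are illustrative only:

count=$(wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html 2>&1 >/dev/null | grep -c 302)
if [ "$count" -gt 0 ]; then
    echo "got a 302 redirect - SiteMinder appears to be up"
else
    echo "no 302 in the response - down or unreachable"
fi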