I want to grab the HTTP status code once a URLError exception is raised.
I tried this, but it didn't help:
except URLError, e:
    logger.warning('It seems like the server is down. Code: ' + str(e.code))
URLError: the handlers raise this exception (or derived exceptions) when they run into a problem. It is a subclass of OSError. Its reason attribute describes the cause of the error.
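To see those attributes in action without making a failing network call, the exception can be constructed directly. This sketch uses urllib.error, the Python 3 home of urllib2's exception classes:

```python
from urllib.error import URLError, HTTPError

# URLError carries a `reason` describing what went wrong (no HTTP code):
err = URLError("connection refused")
print(err.reason)  # connection refused

# It is a subclass of OSError, and HTTPError in turn subclasses URLError:
print(issubclass(URLError, OSError))    # True
print(issubclass(HTTPError, URLError))  # True
```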
A simple urllib2 script:

import urllib2

response = urllib2.urlopen('http://python.org/')
print "Response:", response

# Get the URL. This gets the real URL.
print "The URL is:", response.geturl()

# Getting the code
print "This gets the code:", response.getcode()
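The same script in Python 3, where urllib2 became urllib.request. A throwaway local HTTP server is spun up so the example needs no network access (the served URL and body are placeholders):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to port 0 so the OS picks a free port.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/" % server.server_port
response = urlopen(url)
print("Response:", response)
print("The URL is:", response.geturl())            # the real (post-redirect) URL
print("This gets the code:", response.getcode())   # 200
body = response.read()
server.shutdown()
```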
Just catch HTTPError, handle it, and if it's not error 404, simply use raise to re-raise the exception. See the Python tutorial.
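A sketch of that catch-and-re-raise pattern in Python 3 syntax. The fetch_or_none helper and the injectable opener are illustrative only, used here so the pattern can be demonstrated with constructed exceptions instead of live requests:

```python
import urllib.request
from urllib.error import HTTPError

def fetch_or_none(url, opener=urllib.request.urlopen):
    """Return the response, or None on a 404; re-raise any other HTTP error."""
    try:
        return opener(url)
    except HTTPError as e:
        if e.code == 404:
            return None  # handle the 404 (here: treat as "not found")
        raise            # re-raise everything else unchanged

# Simulated openers, since no network is assumed:
def raise_404(url):
    raise HTTPError(url, 404, "Not Found", None, None)

def raise_500(url):
    raise HTTPError(url, 500, "Server Error", None, None)

print(fetch_or_none("http://example.invalid/x", raise_404))  # None
try:
    fetch_or_none("http://example.invalid/x", raise_500)
except HTTPError as e:
    print("re-raised:", e.code)  # re-raised: 500
```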
In the event of a network problem (e.g. DNS failure, refused connection, etc), Requests will raise a ConnectionError exception. In the event of the rare invalid HTTP response, Requests will raise an HTTPError exception. If a request times out, a Timeout exception is raised.
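Those Requests failure modes can be handled separately, as in this sketch (it assumes the third-party requests package is installed; the safe_get wrapper is an illustration, not part of the library):

```python
import requests

def safe_get(url, timeout=5):
    """Illustrative wrapper: return (status_code, None) on success,
    or (None, exception) for the failure modes described above."""
    try:
        r = requests.get(url, timeout=timeout)
        r.raise_for_status()  # turns 4xx/5xx responses into requests HTTPError
        return r.status_code, None
    except requests.exceptions.Timeout as e:        # request timed out
        return None, e
    except requests.exceptions.ConnectionError as e:  # DNS failure, refused connection, ...
        return None, e
    except requests.exceptions.HTTPError as e:      # invalid/error HTTP response
        return None, e
```

All three exceptions share the base class requests.exceptions.RequestException, so a single broad except clause is also an option.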
You shouldn't check for a status code after catching URLError, since that exception can also be raised in situations where there's no HTTP status code available, for example when you're getting connection-refused errors. Use HTTPError to check for HTTP-specific errors, and then use URLError to check for other problems:
try:
    urllib2.urlopen(url)
except urllib2.HTTPError, e:
    print e.code
except urllib2.URLError, e:
    print e.args
Of course, you'll probably want to do something more clever than just printing the error codes, but you get the idea.
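For reference, the same snippet in Python 3 syntax, where the module split into urllib.request and urllib.error. Note that HTTPError must be caught first, since it is a subclass of URLError:

```python
import urllib.request
from urllib.error import HTTPError, URLError

def report(url):
    try:
        urllib.request.urlopen(url)
    except HTTPError as e:   # must come first: HTTPError subclasses URLError
        print(e.code)
    except URLError as e:
        print(e.reason)
```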
Not sure why you are getting this error. If you are using urllib2, this should help:

import urllib2
from urllib2 import URLError

try:
    urllib2.urlopen(url)
except URLError, e:
    # Only HTTPError instances have a .code; fall back to .reason otherwise
    print getattr(e, 'code', e.reason)