
Catching HTTP errors

Tags: python, urllib

How can I catch 404 and 403 errors for pages in Python with urllib(2), for example?

Is there a quick way to do this without writing big wrapper classes?

Additional info (stack trace):

Traceback (most recent call last):
  File "test.py", line 3, in <module>
    page = urllib2.urlopen("http://localhost:4444")
  File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.6/urllib2.py", line 391, in open
    response = self._open(req, data)
  File "/usr/lib/python2.6/urllib2.py", line 409, in _open
    '_open', req)
  File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.6/urllib2.py", line 1161, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 111] Connection refused>
Asked Jul 15 '10 by Max Frai

2 Answers

import urllib2

try:
    page = urllib2.urlopen("some url")
except urllib2.HTTPError, err:
    if err.code == 404:
        print "Page not found!"
    elif err.code == 403:
        print "Access denied!"
    else:
        print "Something happened! Error code", err.code
except urllib2.URLError, err:
    print "Some other error happened:", err.reason

In your case, the error occurs before an HTTP connection can even be established, so you need a second handler that catches URLError. That failure has nothing to do with 404 or 403 errors.
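For reference, a minimal sketch of the same handling on Python 3, where urllib2 was split into urllib.request and urllib.error (the URL is a placeholder):

from urllib import request, error

try:
    page = request.urlopen("http://example.com/some-page")
except error.HTTPError as err:
    # raised once the server has answered with a 4xx/5xx status
    if err.code == 404:
        print("Page not found!")
    elif err.code == 403:
        print("Access denied!")
    else:
        print("Something happened! Error code", err.code)
except error.URLError as err:
    # raised when no HTTP response exists at all,
    # e.g. the "Connection refused" from the traceback above
    print("Some other error happened:", err.reason)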

Answered by Tim Pietzcker

import urllib2

req = urllib2.Request('url')
try:
    urllib2.urlopen(req)
except urllib2.URLError, e:
    # for 404/403 etc. this is actually an HTTPError (a subclass of URLError),
    # which carries the status code and the error page body
    print e.code
    print e.read()
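Note that this relies on the exception being an HTTPError: for HTTP-level failures it behaves like a response object, exposing code, read() for the error page body, and info() for the headers. For connection-level failures such as the "Connection refused" in the question, only reason is available, as shown in the first answer.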
Answered by 0xAX