How to retry urlfetch.fetch a few more times in case of error?

Quite often GAE is not able to upload the file, and I get the following error:

ApplicationError: 2
Traceback (most recent call last):
  File "/base/python_runtime/python_lib/versions/1/google/appengine/ext/webapp/__init__.py", line 636, in __call__
    handler.post(*groups)
  File "/base/data/home/apps/picasa2vkontakte/1.348093606241250361/picasa2vkontakte.py", line 109, in post
    headers=headers
  File "/base/python_runtime/python_lib/versions/1/google/appengine/api/urlfetch.py", line 260, in fetch
    return rpc.get_result()
  File "/base/python_runtime/python_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 592, in get_result
    return self.__get_result_hook(self)
  File "/base/python_runtime/python_lib/versions/1/google/appengine/api/urlfetch.py", line 355, in _get_fetch_result
    raise DownloadError(str(err))
DownloadError: ApplicationError: 2

How should I perform retries in case of such an error?

        try:
            result = urlfetch.fetch(url=self.request.get('upload_url'),
                                    payload=''.join(data),
                                    method=urlfetch.POST,
                                    headers=headers)
        except DownloadError:
            # how to retry 2 more times?
            pass
        # and how to verify result here?
asked May 04 '11 by LA_

1 Answer

If you can, move this work into the task queue. When tasks fail, they retry automatically. If they continue to fail, the system gradually backs off the retry frequency, to as slow as once per hour. This is an easy way to handle requests to rate-limited APIs without writing one-off retry logic.
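
As a minimal sketch of that approach (the /upload-worker URL, the handler names, and the retry limit of 5 are illustrative assumptions, not part of the original question):

from google.appengine.api import taskqueue, urlfetch
from google.appengine.ext import webapp

class UploadHandler(webapp.RequestHandler):
  def post(self):
    # Enqueue the upload instead of fetching inline; failed tasks are
    # retried automatically with exponential backoff.
    taskqueue.add(url='/upload-worker',  # hypothetical worker endpoint
                  params={'upload_url': self.request.get('upload_url')},
                  retry_options=taskqueue.TaskRetryOptions(task_retry_limit=5))

class UploadWorker(webapp.RequestHandler):
  def post(self):
    # Runs on the task queue; an unhandled DownloadError (or any non-2xx
    # response from this handler) marks the task failed, so it is retried.
    result = urlfetch.fetch(url=self.request.get('upload_url'),
                            method=urlfetch.POST)
    if result.status_code != 200:
      self.error(500)  # non-2xx response triggers a retry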

If you really need to handle requests synchronously, something like this should work:

for attempt in range(3):
  try:
    result = urlfetch.fetch(...)
    # check your success conditions on result here
    break
  except DownloadError:
    logging.warning('urlfetch failed (attempt %d of 3)', attempt + 1)
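
To address the "how to verify result" part of the question: urlfetch.fetch returns a response object, so you can check its status_code (or content) before breaking out of the loop. The 200 check below is a sketch; the right success condition depends on the service you are calling:

if result.status_code == 200:
  break  # only stop retrying on success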

You can also pass deadline=10 to urlfetch.fetch to double the default 5-second deadline.
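
For instance (deadline is in seconds):

result = urlfetch.fetch(url=self.request.get('upload_url'),
                        deadline=10)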

answered Nov 14 '22 by Drew Sears