I'm aware that urllib2 is available on Google App Engine as a wrapper around Urlfetch and, as you know, Universal Feedparser uses urllib2. Do you know of any way to set a timeout on urllib2? Has the timeout parameter of urllib2 been ported to the Google App Engine version?

I'm not interested in methods like:
rssurldata = urlfetch(rssurl, deadline=..)
feedparser.parse(rssurldata)
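Ideally I'd like feedparser to keep fetching the URL itself (through urllib2, and therefore Urlfetch underneath), with the timeout applied at that level. A rough sketch of the desired call, with an illustrative feed URL:

import feedparser

# Desired: feedparser fetches the feed itself via urllib2/urlfetch,
# but there is currently no place to hand it a timeout/deadline.
rssurl = 'http://example.com/feed.xml'  # illustrative URL
d = feedparser.parse(rssurl)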
There's no simple way to do this; to the best of my knowledge, the wrapper doesn't provide a way to pass the timeout value through. One hackish option would be to monkeypatch the urlfetch API:
from google.appengine.api import urlfetch

# Keep a reference to the real fetch, then install a wrapper that
# defaults deadline to 10 seconds while forwarding everything else.
old_fetch = urlfetch.fetch
def new_fetch(url, payload=None, method=urlfetch.GET, headers={},
              allow_truncated=False, follow_redirects=True,
              deadline=10.0, *args, **kwargs):
    return old_fetch(url, payload, method, headers, allow_truncated,
                     follow_redirects, deadline, *args, **kwargs)
urlfetch.fetch = new_fetch
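With the patch applied at import time (before any feed is fetched), feedparser can then be used directly; a minimal sketch, assuming the snippet above lives in a hypothetical module called monkeypatch_urlfetch:

import monkeypatch_urlfetch  # hypothetical module holding the patch above
import feedparser

# feedparser fetches the URL itself via urllib2, which App Engine routes
# through urlfetch.fetch -- now defaulting to a 10-second deadline.
feed = feedparser.parse('http://example.com/feed.xml')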
I prefer the following approach. It holds up better against GAE API updates, since the wrapper forwards *args and **kwargs instead of restating urlfetch.fetch's full signature:
# -*- coding: utf-8 -*-
from google.appengine.api import urlfetch

import settings


def fetch(*args, **kwargs):
    """
    Base fetch func with default deadline settings.
    """
    fetch_kwargs = {
        'deadline': settings.URL_FETCH_DEADLINE,
    }
    fetch_kwargs.update(kwargs)
    return urlfetch.fetch(*args, **fetch_kwargs)
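A minimal sketch of how this wrapper might be used, assuming a settings module that defines URL_FETCH_DEADLINE (the value below is illustrative) and that the wrapper above lives in a hypothetical module named fetch_helper:

# settings.py -- illustrative default deadline, in seconds
URL_FETCH_DEADLINE = 10

# caller code: use the wrapper instead of urlfetch.fetch directly;
# per-call keyword arguments still win, because kwargs overwrite the default.
from fetch_helper import fetch  # hypothetical module holding the wrapper above

result = fetch('http://example.com/feed.xml')               # default deadline
result = fetch('http://example.com/feed.xml', deadline=30)  # explicit override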