Has anyone got any experience with the following exception when using GAE urlfetch?
DownloadError: ApplicationError: 2 timed out
I'm trying to send an HTTP POST request, like so:
result = urlfetch.fetch('http://api.nathan.com:8080/Obj/',
                        method='POST',
                        payload=postdata,
                        deadline=10)
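For context, the fetch happens inside a plain webapp GET handler (as the traceback below shows). A minimal self-contained sketch of it, with the handler name and the literal payload as illustrative placeholders rather than my exact code, looks like this:

from google.appengine.api import urlfetch
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class ProxyHandler(webapp.RequestHandler):
    def get(self):
        # Illustrative payload; the real postdata is built elsewhere.
        postdata = '<Obj><a>1</a><b>n</b></Obj>'
        result = urlfetch.fetch('http://api.nathan.com:8080/Obj/',
                                method='POST',
                                payload=postdata,
                                deadline=10)  # 10 seconds is the maximum deadline
        self.response.out.write(result.content)

application = webapp.WSGIApplication([('/', ProxyHandler)])
run_wsgi_app(application)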
I've tried setting the deadline to the maximum (10 seconds). The same request from the command line (using curl or httplib2) takes about a second:
nchong@almond ~ $ time curl \
    -d "<Obj><a>1</a><b>n</b></Obj>" \
    http://api.nathan.com:8080/Obj/
agd1c2VyYXBpcgoLEgRTZXNzGAIM  #< key returned by call
real    0m1.109s
user    0m0.003s
sys     0m0.009s
Here's the output from the dev appserver for the curl request (I'm using appengine-rest-server):
INFO __init__.py:819] adding models from module __main__
INFO __init__.py:867] added model Obj with type <class '__main__.Obj'>
INFO dev_appserver.py:3243] "POST /Obj HTTP/1.1" 200 -
INFO dev_appserver_index.py:205] Updating /path/to/index.yaml
Here's the output when I try to use urlfetch:
ERROR __init__.py:388] ApplicationError: 2 timed out
Traceback (most recent call last):
  File "/path/to/webapp/__init__.py", line 507, in __call__
    handler.get(*groups)
  File "/path/to/myapp/main.py", line 62, in get
    result = urlfetch.fetch(...)
  File "/path/to/urlfetch.py", line 241, in fetch
    return rpc.get_result()
  File "/path/to/apiproxy_stub_map.py", line 501, in get_result
    return self.__get_result_hook(self)
  File "/path/to/urlfetch.py", line 325, in _get_fetch_result
    raise DownloadError(str(err))
DownloadError: ApplicationError: 2 timed out
INFO dev_appserver.py:3243] "GET / HTTP/1.1" 500 -
INFO dev_appserver.py:3243] "POST /Obj/ HTTP/1.1" 200 -
The development web server is single-threaded, so you cannot make a request from your application back to itself: the server is still busy handling your original request, and the urlfetch call simply times out. Try running two instances on different ports.
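For example, with the Python SDK's dev_appserver.py you can start the REST API and the calling app as two separate dev server processes, something like this (the directory names are just placeholders for wherever each app.yaml lives):

# Terminal 1: serve the REST API on port 8080
dev_appserver.py --port=8080 /path/to/api_app

# Terminal 2: serve the app that makes the urlfetch call on another port
dev_appserver.py --port=8081 /path/to/main_app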
By the way, this should not be a problem once the app is deployed, as the production App Engine servers can of course handle multiple simultaneous requests.