I am using ndb to write a profiling model that logs some data for each application request. Each request kicks off an ndb write with ndb.put_async to log the data, and the client does not care about the result. In essence, I do not want the application request to wait on saving the profiling statistics.
However, I was confused by the explanation in the official documentation. If an application request finishes before the ndb request finishes, is the ndb request still guaranteed to finish? The documentation indicates that
if the request handler exits too early, the put might never happen
Under what circumstances would this happen? Does this mean that, regardless of whether a user cares about the result, future.get_result needs to be called anyway just to make sure the ndb request is performed?
The original documentation (https://developers.google.com/appengine/docs/python/ndb/async) says:
In this example, it's a little silly to call future.get_result: the application never uses the result from NDB. That code is just in there to make sure that the request handler doesn't exit before the NDB put finishes; if the request handler exits too early, the put might never happen. As a convenience, you can decorate the request handler with @ndb.toplevel. This tells the handler not to exit until its asynchronous requests have finished. This in turn lets you send off the request and not worry about the result.
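For concreteness, here is a minimal sketch of the explicit-wait pattern the quoted passage describes, assuming a webapp2 handler; the RequestLog model, its fields, and the handler name are made up for illustration:

```python
from google.appengine.ext import ndb
import webapp2


class RequestLog(ndb.Model):
    # Hypothetical profiling entity; the fields are illustrative.
    path = ndb.StringProperty()
    latency_ms = ndb.IntegerProperty()


class ExplicitWaitHandler(webapp2.RequestHandler):
    def get(self):
        # Start the write without blocking the rest of the handler...
        future = RequestLog(path=self.request.path, latency_ms=7).put_async()
        self.response.write('hello')
        # ...then block before the handler returns; without this,
        # the put may never happen if the handler exits first.
        future.get_result()
```

The final get_result call is what the documentation calls "a little silly": its return value is discarded, and it exists only to keep the handler alive until the RPC completes.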
If an application request has finished before the ndb request finishes, would the ndb request still be guaranteed to finish?
No.
Does this mean that, regardless of whether a user cares about the result, future.get_result needs to be called anyway just to make sure the ndb request is performed?
Basically yes, but you can use the ndb.toplevel decorator for convenience so that you don't have to wait for the result explicitly. That said, I don't think this is what you want.
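As a sketch of that convenience (reusing the hypothetical RequestLog model from the sketch above), the decorator makes the handler wait for all of its outstanding async NDB calls before it exits:

```python
class ToplevelHandler(webapp2.RequestHandler):
    @ndb.toplevel  # waits for all pending async NDB operations on exit
    def get(self):
        # No future.get_result() needed; the decorator ensures the
        # put completes before the handler actually finishes.
        RequestLog(path=self.request.path, latency_ms=7).put_async()
        self.response.write('hello')
```

Note that the request still waits for the put, just implicitly at the very end of the handler, so the user-facing latency still includes the write.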
The taskqueue API is probably what you really want: an enqueued task survives the end of the request and is retried on failure, so the profiling write neither delays the response nor gets lost. Please check it out.
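A minimal sketch with a push queue (the /log-worker URL, the handler names, and the RequestLog model are again made up for illustration):

```python
from google.appengine.api import taskqueue
from google.appengine.ext import ndb
import webapp2


class RequestLog(ndb.Model):
    # Hypothetical profiling entity.
    path = ndb.StringProperty()


class AppHandler(webapp2.RequestHandler):
    def get(self):
        # Enqueueing is cheap; the task runs after this handler
        # returns and is retried automatically if the worker fails.
        taskqueue.add(url='/log-worker', params={'path': self.request.path})
        self.response.write('hello')


class LogWorker(webapp2.RequestHandler):
    def post(self):
        # Executes outside the user-facing request.
        RequestLog(path=self.request.get('path')).put()


app = webapp2.WSGIApplication([
    ('/', AppHandler),
    ('/log-worker', LogWorker),
])
```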