In scrapy.core.engine, the ExecutionEngine method start is:
    @defer.inlineCallbacks
    def start(self):
        """Start the execution engine"""
        assert not self.running, "Engine already running"
        self.start_time = time()
        yield self.signals.send_catch_log_deferred(signal=signals.engine_started)
        self.running = True
        self._closewait = defer.Deferred()
        yield self._closewait
Why not call self.signals.send_catch_log_deferred(signal=signals.engine_started) directly, instead of using yield?
Because send_catch_log_deferred returns a Deferred object. If you wanted to avoid the yield there, you would use send_catch_log instead, but the whole point of send_catch_log_deferred is to allow listeners to return Deferred objects. Signals dispatched with send_catch_log cannot collect Deferred objects from their listeners, so those listeners cannot perform asynchronous operations.
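The contrast can be sketched with a toy Deferred (a hypothetical stand-in, not Twisted's real classes or Scrapy's actual signal manager):

```python
class ToyDeferred:
    """Minimal stand-in for twisted.internet.defer.Deferred."""
    def __init__(self):
        self._callbacks = []
        self.called = False
        self.result = None

    def addCallback(self, fn):
        if self.called:
            self.result = fn(self.result)
        else:
            self._callbacks.append(fn)
        return self

    def callback(self, result):
        self.called = True
        self.result = result
        for fn in self._callbacks:
            self.result = fn(self.result)

def send_catch_log(listeners):
    # Synchronous dispatch: return values are collected immediately,
    # so a ToyDeferred returned here would simply be ignored.
    return [listener() for listener in listeners]

def send_catch_log_deferred(listeners):
    # Deferred dispatch: fires its own ToyDeferred only after every
    # listener's ToyDeferred has fired.
    pending = [listener() for listener in listeners]
    done = ToyDeferred()
    results = []

    def collect(value):
        results.append(value)
        if len(results) == len(pending):
            done.callback(results)
        return value

    for d in pending:
        d.addCallback(collect)
    return done

# A listener that finishes later (we fire it manually below).
slow = ToyDeferred()
def slow_listener():
    return slow

agg = send_catch_log_deferred([slow_listener])
print(agg.called)   # -> False: the engine would still be paused at the yield
slow.callback("engine_started handled")
print(agg.called)   # -> True: every listener's Deferred has fired
```

Yielding the aggregate Deferred is what lets the engine pause at that line until all engine_started listeners, synchronous or not, have finished.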
Edit: for a good introduction to inlineCallbacks, see: http://krondo.com/?p=2441
@defer.inlineCallbacks expects the decorated function to be a generator function, and calling a generator function inside the decorated function (or even returning one) doesn't make the decorated function itself a generator function. An investigation:
    >>> def gen():
    ...     yield 1
    ...
    >>> def func(): return gen
    ...
    >>> import dis
    >>> dis.dis(gen)
      2           0 LOAD_CONST               1 (1)
                  3 YIELD_VALUE
                  4 POP_TOP
                  5 LOAD_CONST               0 (None)
                  8 RETURN_VALUE
    >>> dis.dis(func)
      1           0 LOAD_GLOBAL              0 (gen)
                  3 RETURN_VALUE
    >>> import inspect
    >>> inspect.isgeneratorfunction(gen)
    True
    >>> inspect.isgeneratorfunction(func)
    False
So the only way to satisfy @defer.inlineCallbacks is to yield a Deferred, whether from self.signals.send_catch_log_deferred(signal=signals.engine_started) or from somewhere else.
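What inlineCallbacks does with that generator can be sketched roughly (a toy driver, not Twisted's actual implementation, which additionally waits for each yielded Deferred to fire before resuming):

```python
def run_inline_callbacks(gen_func):
    # Toy version of what @defer.inlineCallbacks does: it needs an
    # actual generator to step through, yield by yield.
    gen = gen_func()
    try:
        value = next(gen)              # run up to the first yield
        while True:
            # Twisted would wait for the yielded Deferred here; this
            # sketch just sends the value straight back in and resumes.
            value = gen.send(value)
    except StopIteration:
        pass                           # generator finished normally
    except TypeError:
        # gen_func() did not return a generator, so next() failed --
        # the same situation inlineCallbacks rejects.
        raise TypeError("decorated function must be a generator function")

def start():
    """A start() containing a yield is a generator function, so it works."""
    yield "engine_started"

run_inline_callbacks(start)            # completes without error

def start_no_yield():
    return "engine_started"            # no yield: not a generator function

# run_inline_callbacks(start_no_yield) would raise TypeError, just as
# @defer.inlineCallbacks rejects a non-generator function.
```

This is why the yield must appear in the decorated function's own body: only a literal yield makes it a generator function at compile time.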