I have a question about how the event loop in python's asyncio
module manages outstanding tasks. Consider the following code:
import asyncio

@asyncio.coroutine
def a():
    for i in range(0, 3):
        print('a.' + str(i))
        yield

@asyncio.coroutine
def b():
    for i in range(0, 3):
        print('b.' + str(i))
        yield

@asyncio.coroutine
def c():
    for i in range(0, 3):
        print('c.' + str(i))
        yield

tasks = [
    asyncio.Task(a()),
    asyncio.Task(b()),
    asyncio.Task(c()),
]

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))
Running this will print:
a.0
b.0
c.0
a.1
b.1
c.1
a.2
b.2
c.2
Notice that it always prints out 'a' then 'b' then 'c'. I'm guessing that no matter how many iterations each coroutine goes through it will always print in that order. So you'd never see something like
b.100
c.100
a.100
Coming from a node.js background, this tells me that the event loop here is maintaining a queue internally that it uses to decide which task to run next. It initially puts a() at the front of the queue, then b(), then c(), since that's the order of the tasks in the list passed to asyncio.wait(). Then whenever it hits a yield statement it puts that task at the end of the queue. I guess in a more realistic example, say if you were doing an async HTTP request, it would put a() back on the end of the queue after the HTTP response came back.
Can I get an amen on this?
A few related notes. asyncio.gather() runs multiple asynchronous operations concurrently, wrapping any coroutines as tasks; what it guarantees is that the results it returns come back in the same order as the corresponding awaitables passed in, regardless of which finishes first. Also, asyncio.run() (new in version 3.7) is the recommended main entry point for asyncio programs and should ideally be called only once. asyncio.get_event_loop() is now deprecated and Python emits warnings when you call it, so prefer asyncio.run() instead.
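For instance, here is a small sketch (my own example, not from the original answers) illustrating gather()'s ordering guarantee: "C" finishes first, but the results still come back in the order the awaitables were passed.

import asyncio

async def work(tag, delay):
    await asyncio.sleep(delay)
    return tag

async def main():
    # "C" completes before "B", which completes before "A"...
    results = await asyncio.gather(work("A", 0.3), work("B", 0.2), work("C", 0.1))
    print(results)   # ...but the results are still ['A', 'B', 'C']

asyncio.run(main())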
Currently your example doesn't include any blocking I/O code. Try this to simulate some tasks:
import asyncio

@asyncio.coroutine
def coro(tag, delay):
    for i in range(1, 8):
        print(tag, i)
        yield from asyncio.sleep(delay)

loop = asyncio.get_event_loop()

print("---- await 0 seconds :-) --- ")
tasks = [
    asyncio.Task(coro("A", 0)),
    asyncio.Task(coro("B", 0)),
    asyncio.Task(coro("C", 0)),
]
loop.run_until_complete(asyncio.wait(tasks))

print("---- simulate some blocking I/O --- ")
tasks = [
    asyncio.Task(coro("A", 0.1)),
    asyncio.Task(coro("B", 0.3)),
    asyncio.Task(coro("C", 0.5)),
]
loop.run_until_complete(asyncio.wait(tasks))

loop.close()
As you can see, once the coroutines actually wait on something (the sleeps), they are resumed as their waits complete, not in a fixed creation order.
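For reference, here is a roughly equivalent version using modern async/await syntax (generator-based coroutines with @asyncio.coroutine were removed in Python 3.11); the gather-based main() wrapper is my own adaptation, not part of the original answer.

import asyncio

async def coro(tag, delay):
    for i in range(1, 8):
        print(tag, i)
        await asyncio.sleep(delay)

async def main(specs):
    await asyncio.gather(*(coro(tag, delay) for tag, delay in specs))

print("---- await 0 seconds :-) --- ")
asyncio.run(main([("A", 0), ("B", 0), ("C", 0)]))

print("---- simulate some blocking I/O --- ")
asyncio.run(main([("A", 0.1), ("B", 0.3), ("C", 0.5)]))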
DISCLAIMER: For at least v3.9 with the default implementation this appears to be true. However, the inner workings of the event loop are not a public interface and thus may change in new versions. Additionally, asyncio allows the BaseEventLoop implementation to be substituted, which may change its behavior.
When a Task object is created, it calls loop.call_soon to register its _step method as a callback. The _step method actually does the work of calling your coroutine, with calls to send() and processing of the results.
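As a rough illustration (this is not asyncio's actual code), "driving a coroutine with send()" means repeatedly sending a value into the generator so it runs until its next suspension point:

def drive(gen):
    try:
        while True:
            gen.send(None)      # resume the coroutine until it yields again
    except StopIteration:
        pass                    # the coroutine has finished

def steps():
    print("step 1")
    yield                       # suspension point, like awaiting a future
    print("step 2")
    yield

drive(steps())                  # prints "step 1" then "step 2"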
In BaseEventLoop, loop.call_soon places the _step callback at the end of a _ready list of callbacks. Each run of the event loop iterates the _ready list in FIFO order and calls the callbacks. Thus, on the initial run, tasks are executed in the order they were created.
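A quick way to observe that FIFO behavior (my own sketch, using modern async/await syntax): each task's first step runs in creation order, and after each task yields at the same point, the second steps run in the same order again.

import asyncio

async def tagged(tag):
    print(tag, "first step")     # executed during the task's initial step
    await asyncio.sleep(0)       # yield control back to the event loop
    print(tag, "second step")

async def main():
    # Tasks are created, and their first callbacks queued, in this order.
    tasks = [asyncio.create_task(tagged(tag)) for tag in "ABC"]
    await asyncio.gather(*tasks)

asyncio.run(main())              # first steps print A, B, C in creation order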
When the task awaits or yields a future, the point at which the task's _wakeup method gets put back into the queue really depends on the nature of that future.
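Here is a sketch of my own illustrating that point: wake-up order follows the order in which the awaited futures complete, not the order in which the tasks were created.

import asyncio

async def waiter(tag, fut):
    await fut                    # the task suspends until the future resolves
    print(tag, "woke up")

async def main():
    loop = asyncio.get_running_loop()
    futs = {tag: loop.create_future() for tag in "ABC"}
    tasks = [asyncio.create_task(waiter(tag, fut)) for tag, fut in futs.items()]
    await asyncio.sleep(0)       # let every task start and block on its future
    for tag in "CBA":            # resolve the futures in reverse creation order
        futs[tag].set_result(None)
    await asyncio.gather(*tasks) # prints C, B, A

asyncio.run(main())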
Also, note that other callbacks can be registered in between the creation of tasks. While it is true that if TaskA is created before TaskB, the initial run of TaskA will happen before TaskB, there could still be other callbacks that get run in between.
Last, the above behavior also applies only to the default Task class that comes with asyncio. It's possible, however, to specify a custom task factory and use an alternative task implementation, which could also change this behavior.
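A minimal sketch of swapping in an alternative task implementation via loop.set_task_factory(); LoggingTask and logging_factory are hypothetical names of my own, not part of asyncio, and factory signatures have shifted slightly across Python versions, hence the **kwargs pass-through.

import asyncio

class LoggingTask(asyncio.Task):
    # A Task subclass that merely logs when each task is created.
    def __init__(self, coro, *, loop=None, **kwargs):
        print("creating task for", coro)
        super().__init__(coro, loop=loop, **kwargs)

def logging_factory(loop, coro, **kwargs):
    return LoggingTask(coro, loop=loop, **kwargs)

async def main():
    asyncio.get_running_loop().set_task_factory(logging_factory)
    # gather() wraps these coroutines as tasks, so the factory is used.
    await asyncio.gather(asyncio.sleep(0), asyncio.sleep(0))

asyncio.run(main())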