I'd like to know what guarantees Python gives around when an event loop will switch tasks.
As I understand it, async/await is significantly different from threading in that the event loop does not switch tasks based on time slicing, meaning that unless a task yields (await), it will carry on indefinitely. This can actually be useful, because it is easier to manage critical sections under asyncio than with threading.
What I'm less clear about is something like the following:
async def caller():
    while True:
        await callee()

async def callee():
    pass
In this example caller is repeatedly awaiting, so technically it is yielding. But I'm not clear on whether this will allow other tasks on the event loop to execute, because it only ever yields to callee, and callee never yields.
That is, if I awaited callee inside a "critical section", even though I know it won't block, am I at risk of something else unexpected happening?
You are right to be wary. caller yields from callee, and yields to the event loop. Then the event loop decides which task to resume. Other tasks may (hopefully) be squeezed in between the calls to callee. callee needs to await an actually blocking Awaitable such as an asyncio.Future or asyncio.sleep(), not just another coroutine; otherwise control will not be returned to the event loop until caller returns.
For example, the following code will finish the caller2 task before it even starts working on the caller1 task. Because callee2 is essentially a sync function that never awaits a blocking I/O operation, no suspension point is created, and caller2 resumes immediately after each call to callee2.
import asyncio
import time

async def caller1():
    for i in range(5):
        await callee1()

async def callee1():
    await asyncio.sleep(1)
    print(f"called at {time.strftime('%X')}")

async def caller2():
    for i in range(5):
        await callee2()

async def callee2():
    time.sleep(1)
    print(f"sync called at {time.strftime('%X')}")

async def main():
    task1 = asyncio.create_task(caller1())
    task2 = asyncio.create_task(caller2())
    await task1
    await task2

asyncio.run(main())
Result:
sync called at 19:23:39
sync called at 19:23:40
sync called at 19:23:41
sync called at 19:23:42
sync called at 19:23:43
called at 19:23:43
called at 19:23:44
called at 19:23:45
called at 19:23:46
called at 19:23:47
But if callee2 awaits a real suspension point, as in the following, the task switching will happen (even if it awaits asyncio.sleep(0)), and the two tasks will run concurrently.
async def callee2():
    await asyncio.sleep(1)
    print(f"sync called at {time.strftime('%X')}")
Result:
called at 19:22:52
sync called at 19:22:52
called at 19:22:53
sync called at 19:22:53
called at 19:22:54
sync called at 19:22:54
called at 19:22:55
sync called at 19:22:55
called at 19:22:56
sync called at 19:22:56
This behavior is not necessarily intuitive, but it makes sense considering that asyncio was made to handle I/O operations and networking concurrently, not ordinary synchronous Python code.
Another thing to note: this still works if callee awaits a coroutine that, in turn, awaits an asyncio.Future, asyncio.sleep(), or another coroutine that awaits one of those things somewhere down the chain. Control is returned to the event loop as soon as the blocking Awaitable is awaited. So the following also works.
async def callee2():
    await inner_callee()
    print(f"sync called at {time.strftime('%X')}")

async def inner_callee():
    await asyncio.sleep(1)
TL;DR: No. Coroutines and their respective keywords (await, async with, async for) only enable suspension; whether suspension actually occurs depends on the framework used, if at all.
Third-party async functions / iterators / context managers can act as checkpoints; if you see await <something> or one of its friends, then that might be a checkpoint. So to be safe, you should prepare for scheduling or cancellation happening there. [Trio documentation]
The await syntax of Python is syntactic sugar around two fundamental mechanisms: yield, to temporarily suspend with a value, and return, to permanently exit with a value. These are the same mechanisms that, say, a generator function coroutine can use:
def gencoroutine():
    for i in range(5):
        yield i  # temporarily suspend
    return 5  # permanently exit
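Driving such a generator by hand makes both mechanisms visible; in this sketch (the generator is redefined so the snippet stands alone), each yield surfaces a value via next(), while the return value arrives wrapped in StopIteration:

```python
def gencoroutine():
    for i in range(5):
        yield i  # temporarily suspend with a value
    return 5     # permanently exit with a value

gen = gencoroutine()
yielded = list(gen)  # collects the yielded values; list() discards the return value
print(yielded)  # [0, 1, 2, 3, 4]

gen = gencoroutine()
try:
    while True:
        next(gen)
except StopIteration as stop:
    final = stop.value  # the return value rides along in the exception
print(final)  # 5
```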
Notably, return does not imply a suspension. It is possible for a generator coroutine to never yield at all.
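A minimal sketch of that case (the name never_yields is made up for illustration): the mere presence of yield makes a function a generator function, yet the generator can exit without ever suspending:

```python
def never_yields():
    if False:
        yield  # makes this a generator function...
    return "finished"  # ...but it exits without ever suspending

try:
    next(never_yields())  # the very first resumption already hits `return`
    value = None
except StopIteration as stop:
    value = stop.value
print(value)  # "finished"
```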
The await keyword (and its sibling yield from) interacts with both the yield and return mechanisms:
- If its target yields, await "passes on" the suspension to its own caller. This allows suspending an entire stack of coroutines that all await each other.
- If its target returns, await catches the return value and provides it to its own coroutine. This allows returning a value directly to a "caller", without suspension.
This means that await does not guarantee that a suspension occurs. It is up to the target of await to trigger a suspension.
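The pass-through behaviour can be sketched with plain generators (inner and outer are hypothetical names): yield from forwards inner's suspension out to whoever drives outer, and inner's return value lands back inside outer without any suspension:

```python
def inner():
    got = yield "inner suspended"  # this suspension travels all the way out
    return got * 2                 # this value travels back in without suspending

def outer():
    result = yield from inner()  # forwards inner's suspension to our caller
    return result

gen = outer()
first = next(gen)  # inner's yield surfaces through outer
print(first)  # "inner suspended"
try:
    gen.send(10)  # resumes inner; its `return 20` pops out of `yield from`
    returned = None
except StopIteration as stop:
    returned = stop.value
print(returned)  # 20
```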
By itself, an async def coroutine can only return without suspension, or await to allow suspension. It cannot suspend by itself (yield does not suspend to the event loop).
async def unyielding():
    return 2  # or `pass`
This means that await of just coroutines never suspends. Only specific awaitables are able to suspend.
Suspension is only possible for awaitables with a custom __await__ method. These can yield directly to the event loop.
class YieldToLoop:
    def __await__(self):
        yield   # to event loop
        return  # to awaiter
This means that await, directly or indirectly, of a framework's awaitable will suspend.
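A sketch of this under asyncio (the class is redefined and the worker name is made up for illustration): because each await of the custom awaitable really suspends, two tasks that would otherwise run back-to-back get interleaved by the event loop:

```python
import asyncio

class YieldToLoop:
    def __await__(self):
        yield   # suspend: hand control to the event loop
        return  # resume: hand control back to the awaiter

order = []

async def worker(name):
    for _ in range(3):
        order.append(name)
        await YieldToLoop()  # a real suspension point on every iteration

async def main():
    await asyncio.gather(worker("a"), worker("b"))

asyncio.run(main())
print(order)  # the two tasks alternate rather than running back-to-back
```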
The exact semantics of suspending depend on the async framework in use. For example, whether a sleep(0)
triggers a suspension or not, or which coroutine to run instead, is up to the framework. This also extends to async iterators and context managers -- for example, many async context managers will suspend either on enter or exit but not both.
Trio:
If you call an async function provided by Trio (await <something in trio>), and it doesn't raise an exception, then it always acts as a checkpoint. (If it does raise an exception, it might act as a checkpoint or might not.)
Asyncio:
sleep() always suspends the current task, allowing other tasks to run.
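The contrast can be sketched in a few lines (the names plain, busy, and events are made up for illustration): awaiting a plain coroutine never lets the other task in, while asyncio.sleep(0) suspends despite the zero delay:

```python
import asyncio

events = []

async def plain():
    pass  # a coroutine with no suspension point at all

async def busy(name, use_sleep0):
    for _ in range(2):
        events.append(name)
        if use_sleep0:
            await asyncio.sleep(0)  # zero delay, but a guaranteed suspension
        else:
            await plain()           # runs inline; no suspension occurs

async def main():
    await asyncio.gather(busy("no-switch", False), busy("switch", True))

asyncio.run(main())
print(events)  # ['no-switch', 'no-switch', 'switch', 'switch']
```

The first task runs both iterations to completion before the second ever starts, because awaiting plain() never returns control to the event loop.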