In threading, we have something called "thread-local storage" (a thread context), in which we can save some data (state) for access within a specific thread. In asyncio, I need to save some state in the current execution path, so that all subsequent coroutines can access it. What is the solution? Note: I know each coroutine function is instantiated per execution path in asyncio, but for some reason I cannot save the state in function properties. (Although this method is not very good anyway.)
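For comparison, the threading equivalent of this per-execution-path state is threading.local(); a minimal sketch (the request_ctx name and handler functions are illustrative):

```python
import threading

# Each thread sees its own independent copy of attributes set on this object.
request_ctx = threading.local()
results = []

def handle_request(req_id):
    request_ctx.request_id = req_id  # visible only to the current thread
    process()

def process():
    # Reads the value set earlier on the same thread, not another thread's value
    results.append(request_ctx.request_id)

threads = [threading.Thread(target=handle_request, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The question is asking for the asyncio counterpart of exactly this mechanism, where the unit of isolation is a task rather than a thread.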
A context manager is an object that defines the runtime context to be established when executing a with statement.
Following are some methods of the contextvars.Context class:
Context(): creates an empty context with no values in it.
get(var): returns the value of the given variable if it is assigned in this context; otherwise returns the default value if given, otherwise returns None.
keys(): returns all the variables held in the Context object.
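A minimal sketch of those Context operations from the standard contextvars module (the user variable name is made up for illustration):

```python
import contextvars

# A ContextVar with a default, used as a key into Context snapshots
user = contextvars.ContextVar('user', default='anonymous')

user.set('alice')

# copy_context() captures the current value of every ContextVar
ctx = contextvars.copy_context()

# A Context behaves like a read-only mapping of ContextVar -> value
print(ctx.get(user))       # the value captured at copy time
print(user in ctx.keys())  # the variable is listed among the keys

# Context() creates an empty context with no values in it
empty = contextvars.Context()
print(len(empty))
```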
One of the cool advantages of asyncio is that it scales far better than threading. Each task takes far fewer resources and less time to create than a thread, so creating and running more of them is feasible. This example just creates a separate task for each site to download, which works out quite well.
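A sketch of that task-per-site pattern (the URLs and the download body are placeholders; a real downloader would use an HTTP client such as aiohttp):

```python
import asyncio

async def download_site(url):
    # Placeholder for a real HTTP request
    await asyncio.sleep(0.01)
    return f'{url}: done'

async def main():
    sites = [
        'https://example.com/a',
        'https://example.com/b',
        'https://example.com/c',
    ]
    # One cheap task per site; far lighter than one thread per site
    tasks = [asyncio.create_task(download_site(url)) for url in sites]
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
print(results)
```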
__enter__ and __exit__ are methods that are invoked on entry to and exit from the body of the with statement (PEP 343), and an object implementing both is called a context manager. The with statement is intended to hide the flow control of a try/finally clause and make the code more readable.
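A minimal context manager showing __enter__ and __exit__ standing in for an explicit try/finally (the Resource class is an illustrative name, not a real API):

```python
class Resource:
    """Illustrative context manager; the name is made up."""

    def __init__(self):
        self.closed = False

    def __enter__(self):
        # Invoked on entry to the with block; the return value is bound by "as"
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Invoked on exit, even if the block raised -- like a finally clause
        self.closed = True
        return False  # returning False means exceptions are not suppressed

with Resource() as r:
    pass  # work with the resource here

print(r.closed)  # cleanup ran automatically
```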
As of Python 3.7 you can make use of contextvars.ContextVar.
In the example below I declared request_id and set the value in some_outer_coroutine, then accessed it in some_inner_coroutine.
import asyncio
import contextvars

# declare context var
request_id = contextvars.ContextVar('Id of request.')


async def some_inner_coroutine():
    # get value
    print('Processed inner coroutine of request: {}'.format(request_id.get()))


async def some_outer_coroutine(req_id):
    # set value
    request_id.set(req_id)
    await some_inner_coroutine()
    # get value
    print('Processed outer coroutine of request: {}'.format(request_id.get()))


async def main():
    tasks = []
    for req_id in range(1, 5):
        tasks.append(asyncio.create_task(some_outer_coroutine(req_id)))
    await asyncio.gather(*tasks)


if __name__ == '__main__':
    asyncio.run(main())
Output:
Processed inner coroutine of request: 1
Processed outer coroutine of request: 1
Processed inner coroutine of request: 2
Processed outer coroutine of request: 2
Processed inner coroutine of request: 3
Processed outer coroutine of request: 3
Processed inner coroutine of request: 4
Processed outer coroutine of request: 4
There's also https://github.com/azazel75/metapensiero.asyncio.tasklocal, but you must be aware that tasks are often created internally by libraries, and by asyncio itself using ensure_future(a_coroutine), and there's no reliable way to track these new tasks and initialize their locals (perhaps with those of the task they were created from). (A "hack" would be setting a loop.set_task_factory() function with something that does the job, hoping that all code uses loop.create_task() to create the tasks, which is not always true...)
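A sketch of that set_task_factory() hack: a factory that copies a task_locals attribute from the creating task onto each new task. The task_locals name is made up for illustration; the tasklocal library's internals differ.

```python
import asyncio

results = []

def locals_task_factory(loop, coro, **kwargs):
    # Build the task the way the default factory would
    child = asyncio.Task(coro, loop=loop, **kwargs)
    try:
        parent = asyncio.current_task()
    except RuntimeError:
        # The factory ran before the loop started (e.g. for the first task)
        parent = None
    if parent is not None and hasattr(parent, 'task_locals'):
        # Shallow-copy the parent's "locals" onto the new task
        child.task_locals = dict(parent.task_locals)
    return child

async def child():
    # The child sees the copy the factory attached to its task
    results.append(asyncio.current_task().task_locals['request_id'])

async def main():
    loop = asyncio.get_running_loop()
    loop.set_task_factory(locals_task_factory)
    asyncio.current_task().task_locals = {'request_id': 1}
    # This only works because create_task() goes through the factory
    await asyncio.create_task(child())

asyncio.run(main())
```

Any task created by a library that bypasses loop.create_task() will silently miss the copied state, which is exactly the weakness described above.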
Another issue is that if some of your code is executed inside a Future callback, the Task.current_task() function, which both libraries use to select the right copy of locals to serve, will always return None...
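That callback problem can be demonstrated with asyncio.current_task() (the modern spelling of the deprecated Task.current_task()): a done-callback runs directly on the event loop, outside any task, so there is no current task for a locals library to key off.

```python
import asyncio

observed = []

def on_done(fut):
    # Done-callbacks are invoked by the loop itself, not from within a task,
    # so current_task() has nothing to report here
    observed.append(asyncio.current_task())

async def main():
    task = asyncio.create_task(asyncio.sleep(0))
    task.add_done_callback(on_done)
    await task
    await asyncio.sleep(0)  # give the callback a chance to run

asyncio.run(main())
print(observed)
```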