
asyncio + multiprocessing + unix

I have a pet project with the following logic:

import asyncio, multiprocessing

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    asyncio.get_event_loop().run_until_complete(sub_main())

def start():
    multiprocessing.Process(target=sub_loop).start()

start()

If you run it, you'll see:

Hello from subprocess

That works. But what I actually need is to make start() a coroutine instead:

async def start():
    multiprocessing.Process(target=sub_loop).start()

To run it, I have to do something like this:

asyncio.get_event_loop().run_until_complete(start())

Here is the issue: when the subprocess is created (via fork), it inherits the whole Python environment, including the event loop, which is already marked as running there:

Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "test.py", line 7, in sub_loop
    asyncio.get_event_loop().run_until_complete(sub_main())
  File "/usr/lib/python3.5/asyncio/base_events.py", line 361, in run_until_complete
    self.run_forever()
  File "/usr/lib/python3.5/asyncio/base_events.py", line 326, in run_forever
    raise RuntimeError('Event loop is running.')
RuntimeError: Event loop is running.

I tried to destroy the loop on the subprocess side with no luck, but I think the correct way would be to prevent it from being shared with the subprocess in the first place. Is that possible somehow?

UPDATE: Here is the full failing code:

import asyncio, multiprocessing

import asyncio.unix_events

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    asyncio.get_event_loop().run_until_complete(sub_main())


async def start():
    multiprocessing.Process(target=sub_loop).start()

asyncio.get_event_loop().run_until_complete(start())
asked Jul 05 '16 by Grief


1 Answer

First, you should consider using loop.run_in_executor with a ProcessPoolExecutor if you plan to run Python subprocesses from within the loop. As for your problem, you can use the event loop policy functions to set a fresh loop in the child:

import asyncio
from concurrent.futures import ProcessPoolExecutor

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(sub_main())

async def start(executor):
    await asyncio.get_event_loop().run_in_executor(executor, sub_loop)

if __name__ == '__main__':
    executor = ProcessPoolExecutor()
    asyncio.get_event_loop().run_until_complete(start(executor))
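Alternatively, if you want to keep using multiprocessing.Process directly, the 'spawn' start method sidesteps the problem at the source: the child starts a fresh interpreter and never inherits the parent's loop state, so no forked "already running" flag exists there. A minimal sketch (the join-via-executor detail and the returned exit code are my additions for illustration, not part of the original answer):

```python
import asyncio
import multiprocessing

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    # Fresh interpreter (spawn), so build a brand-new loop here
    asyncio.new_event_loop().run_until_complete(sub_main())

async def start():
    # 'spawn' starts a clean Python process instead of forking,
    # so the parent's running event loop is never inherited
    ctx = multiprocessing.get_context('spawn')
    p = ctx.Process(target=sub_loop)
    p.start()
    # Wait for the child without blocking the parent's event loop
    await asyncio.get_event_loop().run_in_executor(None, p.join)
    return p.exitcode

if __name__ == '__main__':
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(start())
```

Note that spawn requires the target to be importable (defined at module top level) and the `if __name__ == '__main__':` guard, since the child re-imports the main module.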
answered Oct 08 '22 by Vincent