How to gracefully exit program using ProcessPoolExecutor?

Take, for example, the following program:

import asyncio
from concurrent.futures import ProcessPoolExecutor


def process():
    print('processed')

async def main(loop, executor):
    await loop.run_in_executor(executor, process)
    await asyncio.sleep(60.0)

executor = ProcessPoolExecutor()
loop = asyncio.get_event_loop()
try:
    loop.run_until_complete(main(loop, executor))
except KeyboardInterrupt:
    pass
finally:
    executor.shutdown()

If I hit Ctrl + C while the program is running, I get a really messy traceback as it exits:

processed
^CProcess Process-3:
Process Process-4:
Process Process-2:
Traceback (most recent call last):
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/process.py", line 254, in _bootstrap
    self.run()
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/concurrent/futures/process.py", line 169, in _process_worker
    call_item = call_queue.get(block=True)
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/queues.py", line 93, in get
    with self._rlock:
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/synchronize.py", line 96, in __enter__
    return self._semlock.__enter__()
Traceback (most recentTraceback (most recent call last):
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/process.py", line 254, in _bootstrap
    self.run()
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/concurrent/futures/process.py", line 169, in _process_worker
    call_item = call_queue.get(block=True)
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/queues.py", line 94, in get
    res = self._recv_bytes()
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/connection.py", line 216, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versi call last):
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/process.py", line 254, in _bootstrap
    self.run()
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/concurrent/futures/process.py", line 169, in _process_worker
    call_item = call_queue.get(block=True)
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/queues.py", line 93, in get
    with self._rlock:
  File "/usr/local/Cellar/python3/3.5.0/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/synchronize.py", line 96, in __enter__
    return self._semlock.__enter__()
KeyboardInterrupt
..... (It goes on for a while longer)

Is there a more graceful way to handle KeyboardInterrupt in a program using a multiprocessing pool?

Asked Mar 15 '23 by Theron Luhn

1 Answer

Not sure if this is the correct (or the only) solution, but I usually add an explicit SIGINT signal handler rather than relying on the default behaviour of the interpreter raising KeyboardInterrupt on SIGINT. This gives you a bit more control and hopefully avoids unintended effects.

Update with @germn's suggestion:

import asyncio
import signal
from concurrent.futures import ProcessPoolExecutor

def shutdown(loop, executor):
    # Runs in the main process when SIGINT arrives: shut the pool down,
    # cancel every pending task, then stop the loop.
    executor.shutdown()
    for task in asyncio.Task.all_tasks():
        task.cancel()
    loop.stop()

def process():
    print('processed')

async def main(loop, executor):
    await loop.run_in_executor(executor, process)
    # A few extra pending tasks, to show that shutdown() cancels them all.
    loop.create_task(asyncio.sleep(120))
    loop.create_task(asyncio.sleep(12))
    loop.create_task(asyncio.sleep(130))
    await asyncio.sleep(60.0)

executor = ProcessPoolExecutor()
loop = asyncio.get_event_loop()
# Handle SIGINT ourselves instead of letting it raise KeyboardInterrupt.
loop.add_signal_handler(signal.SIGINT, shutdown, loop, executor)
loop.run_until_complete(main(loop, executor))
loop.close()
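
Note that asyncio.Task.all_tasks() was removed in Python 3.9, so the snippet above only runs as-is on older interpreters. Here's a minimal sketch of the same idea adapted for Python 3.9+ using the module-level asyncio.all_tasks(); the structure is my own adaptation, not part of the original answer:

import asyncio
import signal
from concurrent.futures import ProcessPoolExecutor

def shutdown(loop, executor):
    # Don't block the signal handler waiting on workers; just stop
    # feeding the pool and cancel every still-pending task.
    executor.shutdown(wait=False)
    for task in asyncio.all_tasks(loop):
        task.cancel()

def process():
    print('processed')

async def main(loop, executor):
    await loop.run_in_executor(executor, process)
    await asyncio.sleep(60.0)

executor = ProcessPoolExecutor()
loop = asyncio.new_event_loop()
loop.add_signal_handler(signal.SIGINT, shutdown, loop, executor)
try:
    # Cancelling main() makes run_until_complete() raise CancelledError,
    # so catch that instead of calling loop.stop() in the handler.
    loop.run_until_complete(main(loop, executor))
except asyncio.CancelledError:
    pass
finally:
    loop.close()

The try/except around run_until_complete replaces the loop.stop() call: once main() is cancelled, the loop unwinds on its own.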
Answered Apr 01 '23 by Jashandeep Sohi