How to terminate multiprocessing Pool processes?

I'm working on a render farm, and I need my clients to be able to launch multiple instances of a renderer without blocking, so the client can keep receiving new commands. I've got that working correctly; however, I'm having trouble terminating the created processes.

At the global level, I define my pool (so that I can access it from any function):

from multiprocessing import Pool

p = Pool(2)

I then call my renderer with apply_async:

for i in range(totalInstances):
    p.apply_async(render, (allRenderArgs[i],args[2]), callback=renderFinished)
p.close()

That function finishes, launches the processes in the background, and waits for new commands. I've made a simple command that will kill the client and stop the renders:

def close():
    '''
        close this client instance
    '''
    tn.write("say " + USER + " is leaving the farm\r\n")
    try:
        p.terminate()
    except Exception as e:
        print(str(e))
    sys.exit()

It doesn't seem to give an error (it would print the error); the Python process exits, but the background processes are still running. Can anyone recommend a better way of controlling these launched programs?

asked May 06 '13 by tk421storm

People also ask

How do processes pools work in multiprocessing?

A Pool allows multiple jobs per process, which can make it easier to parallelize your program. If you have a number of jobs to run in parallel, you can create a Pool with as many processes as you have CPU cores, and then pass the list of jobs to pool.map.

What is a multiprocessing pool?

multiprocessing.Pool creates multiple Python processes in the background and spreads your computations across multiple CPU cores so that they all happen in parallel, without you needing to manage the individual processes yourself.

How does pool Apply_async work?

The apply_async() function can be called to execute a target function in the process pool. The call does not block; instead it immediately returns an AsyncResult object, which we can ignore if our function does not return a value.

Which is the method used to change the default way to create child processes in multiprocessing?

Python provides the ability to create and manage new processes via the multiprocessing.Process class. In multiprocessing programming, we may need to change the technique used to start child processes; this is called the start method.
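The start method is changed with multiprocessing.set_start_method(), which must be called at most once, before any pools or processes are created. A minimal sketch:

```python
import multiprocessing as mp

def child():
    # Runs in the new process
    print("child running")

if __name__ == "__main__":
    # Switch from the platform default (fork on Linux) to spawn;
    # force=True allows re-setting it in interactive sessions
    mp.set_start_method("spawn", force=True)
    p = mp.Process(target=child)
    p.start()
    p.join()
    print("child exit code:", p.exitcode)  # 0 on success
```

Under "spawn", the child re-imports the module, which is why the process creation must sit under the __name__ guard.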


1 Answer

I found a solution: stop the pool in a separate thread, like this:

import signal
import sys
import threading

def close_pool():
    global pool
    pool.close()
    pool.terminate()
    pool.join()

def term(*args, **kwargs):
    sys.stderr.write('\nStopping...')
    # Shut down the HTTP server and the pool from separate threads,
    # so the signal handler itself returns immediately
    stophttp = threading.Thread(target=httpd.shutdown)
    stophttp.start()
    stoppool = threading.Thread(target=close_pool)
    stoppool.daemon = True
    stoppool.start()


signal.signal(signal.SIGTERM, term)
signal.signal(signal.SIGINT, term)
signal.signal(signal.SIGQUIT, term)

Works fine every time I tested it.

signal.SIGINT

Interrupt from keyboard (CTRL + C). Default action is to raise KeyboardInterrupt.

signal.SIGKILL

Kill signal. It cannot be caught, blocked, or ignored.

signal.SIGTERM

Termination signal.

signal.SIGQUIT

Quit with core dump.

answered Sep 27 '22 by eri