
Python: how to run a process in detached mode

Here is an example:

from multiprocessing import Process
import time


def func():
    print('sub process is running')
    time.sleep(5)
    print('sub process finished')


if __name__ == '__main__':
    p = Process(target=func)
    p.start()
    print('done')

What I expect is that the main process will terminate right after it starts the subprocess. But after printing 'done', the terminal is still waiting... Is there any way to make the main process exit right after printing 'done', instead of waiting for the subprocess? I'm confused here because I'm not calling p.join().

asked Mar 06 '18 by Ziqi Liu

People also ask

How do you start a separate process in Python?

To start a new process, in other words a new subprocess, in Python, you can use the subprocess.Popen call. The first argument is the program you want to start together with its command-line arguments; additional keyword arguments control how the process is launched.
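
For example, a minimal sketch (the script name worker.py is just a placeholder) that launches another program without waiting for it:

import subprocess
import sys

# Launch another Python script and return immediately; start_new_session=True
# detaches it from this process's session on POSIX systems.
subprocess.Popen([sys.executable, 'worker.py'], start_new_session=True)
print('launcher can exit now')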

What is a daemonic process in Python?

The Python multiprocessing module allows us to have daemon processes through its daemon option. Daemon processes, i.e. processes running in the background, follow a similar concept to daemon threads. To execute a process in the background, set its daemon flag to true.
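
For instance, a minimal sketch of a daemonic process, which is terminated automatically when the main process exits:

from multiprocessing import Process
import time


def background():
    while True:        # pretend to do periodic background work
        time.sleep(1)


if __name__ == '__main__':
    p = Process(target=background, daemon=True)  # daemonic: tied to the parent
    p.start()
    time.sleep(2)
    # when the main process exits here, the daemonic child is killed too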

What is a process pool in Python?

A pool of processes can be created and used in the same way as a pool of threads. A process pool is a group of pre-instantiated, idle processes that stand ready to be given work.
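
As a quick illustration, a minimal sketch using multiprocessing.Pool (the squaring function is just an example):

from multiprocessing import Pool


def square(x):
    return x * x


if __name__ == '__main__':
    # four worker processes pull tasks from the pool and collect the results
    with Pool(processes=4) as pool:
        print(pool.map(square, range(10)))  # [0, 1, 4, 9, ..., 81]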

How does the multiprocessing module work in Python?

multiprocessing is a package that supports spawning processes using an API similar to the threading module. The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads.


2 Answers

Python will not exit while any non-daemon process is still alive.

By setting the daemon attribute before the start() call, you can make the process daemonic.

p = Process(target=func)
p.daemon = True  # <----- make the process daemonic
p.start()
print('done')

NOTE: The 'sub process finished' message will not be printed, because the main process terminates daemonic children when it exits. This may not be what you want.

If you need the child process to keep running, you should do a double fork (note that os.fork is only available on Unix-like systems):

import os
import time
from multiprocessing import Process


def func():
    if os.fork() != 0:  # <-- the intermediate process returns immediately,
        return          # <-- leaving the forked grandchild to run on its own
    print('sub process is running')
    time.sleep(5)
    print('sub process finished')


if __name__ == '__main__':
    p = Process(target=func)
    p.start()
    p.join()  # returns almost at once, since the intermediate process exits
    print('done')
answered Oct 24 '22 by falsetru

Following the excellent answer from @falsetru, I wrote out a quick generalization in the form of a decorator.

import os
from multiprocessing import Process


def detachify(func):
    """Decorate a function so that its calls are async in a detached process.

    Usage
    -----

    .. code::
            import time

            @detachify
            def f(message):
                time.sleep(5)
                print(message)

            f('Async and detached!!!')

    """
    # create a process fork and run the function
    def forkify(*args, **kwargs):
        if os.fork() != 0:
            return
        func(*args, **kwargs)

    # wrapper to run the forkified function
    def wrapper(*args, **kwargs):
        proc = Process(target=lambda: forkify(*args, **kwargs))
        proc.start()
        proc.join()
        return

    return wrapper

Usage (copied from docstring):

import time

@detachify
def f(message):
    time.sleep(5)
    print(message)

f('Async and detached!!!')

Or if you like,

def f(message):
    time.sleep(5)
    print(message)


detachify(f)('Async and detached!!!')
answered Oct 24 '22 by Nolan Conaway