Make a Singleton class work with multiprocessing

I created a Singleton class using a metaclass. It works well with multiple threads and creates only one instance of MySingleton, but with multiprocessing it always creates a new instance:

import multiprocessing

class SingletonType(type):
    # meta class for making a class singleton
    def __call__(cls, *args, **kwargs):
        try:
            return cls.__instance
        except AttributeError:
            cls.__instance = super(SingletonType, cls).__call__(*args, **kwargs)
            return cls.__instance

class MySingleton(metaclass=SingletonType):
    # singleton class

    def __init__(self, *args, **kwargs):
        print("init called")


def task():
    # create singleton class instance
    a = MySingleton()


# create two processes
pro_1 = multiprocessing.Process(target=task)
pro_2 = multiprocessing.Process(target=task)

# start both processes
pro_1.start()
pro_2.start()

My output:

init called
init called

I need the MySingleton class's __init__ method to be called only once.

Asked Jul 13 '17 by Kallz


1 Answer

Each of your child processes runs its own instance of the Python interpreter, so the SingletonType in one process doesn't share its state with the SingletonType in another process. This means that a true singleton which exists in only one of your processes would be of little use, because you couldn't access it from the other processes: while you can manually share data between processes, such sharing is limited to basic data types (for example dicts and lists).
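
For illustration (a minimal sketch, not part of the original answer), manually sharing a basic value between processes could look like this, using multiprocessing.Value to hold a single integer that both processes can see and update:

#!/usr/bin/env python3

import multiprocessing


def worker(counter):
    # Both processes operate on the same shared memory, not a copy.
    with counter.get_lock():
        counter.value += 1


if __name__ == '__main__':
    # 'i' is the typecode for a C int; the initial value is 0.
    counter = multiprocessing.Value('i', 0)

    procs = [multiprocessing.Process(target=worker, args=(counter,))
             for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    print(counter.value)  # prints 2: both increments hit the same shared int

Note that multiprocessing.Value only covers simple ctypes values; for dicts and lists you need a Manager, as in the answer's code below.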

Instead of relying on singletons, simply share the underlying data between the processes:

#!/usr/bin/env python3

import multiprocessing
import os


def log(s):
    print('{}: {}'.format(os.getpid(), s))


class PseudoSingleton(object):

    def __init__(self, *args, **kwargs):
        # Check and initialize under the lock so that only one
        # process performs the initialization.
        with shared_state_lock:
            if not shared_state:
                log('Initializing shared state')
                shared_state['x'] = 1
                shared_state['y'] = 2
                log('Shared state initialized')
            else:
                log('Shared state was already initialized: {}'.format(shared_state))


def task():
    a = PseudoSingleton()


if __name__ == '__main__':
    # We need the __main__ guard so that this part is only executed in
    # the parent

    log('Communication setup')
    shared_state = multiprocessing.Manager().dict()
    shared_state_lock = multiprocessing.Lock()

    # create two processes
    log('Start child processes')
    pro_1 = multiprocessing.Process(target=task)
    pro_2 = multiprocessing.Process(target=task)
    pro_1.start()
    pro_2.start()

    # Wait until processes have finished
    # See https://stackoverflow.com/a/25456494/857390
    log('Wait for children')
    pro_1.join()
    pro_2.join()

    log('Done')

This prints

16194: Communication setup
16194: Start child processes
16194: Wait for children
16200: Initializing shared state
16200: Shared state initialized
16201: Shared state was already initialized: {'x': 1, 'y': 2}
16194: Done

However, depending on your problem, other inter-process communication mechanisms might serve you better. For example, the multiprocessing.Queue class is often very useful.
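
As a rough sketch (not part of the original answer; the worker function and variable names here are purely illustrative), a simple producer/worker setup built on multiprocessing.Queue might look like this:

#!/usr/bin/env python3

import multiprocessing


def worker(tasks, results):
    # Pull items until the sentinel None arrives, then exit.
    while True:
        item = tasks.get()
        if item is None:
            break
        results.put(item * item)


if __name__ == '__main__':
    tasks = multiprocessing.Queue()
    results = multiprocessing.Queue()

    proc = multiprocessing.Process(target=worker, args=(tasks, results))
    proc.start()

    for n in range(5):
        tasks.put(n)
    tasks.put(None)  # sentinel: tell the worker to stop

    # Drain the results before joining so the worker's queue buffer
    # can flush and the join cannot block on it.
    for _ in range(5):
        print(results.get())  # 0, 1, 4, 9, 16

    proc.join()

Queues avoid shared state entirely: each process communicates only by passing messages, which is often easier to reason about than a shared singleton.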

Answered Oct 31 '22 by Florian Brucker