Sharing a variable between processes

I have a downloader function that downloads multiple files in parallel. I use multiprocessing.Pool.map_async to download different chunks of the same file. I would like to show a status bar for the download. For this, I need to know the total number of bytes that have already been downloaded (total_bytes_dl).

    pool = multiprocessing.Pool(processes)
    mapObj = pool.map_async(f, args)

    # Poll until all chunks are done, rewriting the status line in place.
    while not mapObj.ready():
        status = r"%.2f MB / %.2f MB" % (total_bytes_dl / 1024.0 / 1024.0, filesize / 1024.0 / 1024.0)
        status = status + chr(8) * (len(status) + 1)
        print status,
        time.sleep(0.5)

Is there a way to set a variable that is shared among all these processes AND the main process, so that every process can add the number of bytes it has just downloaded?

asked Mar 24 '12 by iTayb



1 Answer

The solution was to initialize each worker process with the shared ctypes value, via the Pool's initializer argument:

    import multiprocessing
    from ctypes import c_int

    import dummy  # a small helper module, used here only as a place to store the shared value

    # Shared integer that every worker process can update.
    shared_bytes_var = multiprocessing.Value(c_int)

    def Func(...):
        ....
        pool = multiprocessing.Pool(initializer=_initProcess, initargs=(shared_bytes_var,))
        ....

    def _initProcess(x):
        # Runs once in each worker process and stores the shared Value
        # where the worker code can reach it.
        dummy.shared_bytes_var = x
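
For context, here is a minimal, self-contained sketch of how such a shared counter can be wired up end to end. The worker function download_chunk and the fake chunk sizes are illustrative assumptions, not part of the original answer; each worker takes the Value's lock before updating it, and the main process reads .value while polling map_async:

    import multiprocessing
    import time
    from ctypes import c_int

    def _init_process(counter):
        # Runs once per worker: expose the shared Value as a module-level name.
        global shared_bytes_var
        shared_bytes_var = counter

    def download_chunk(n):
        # Stand-in for downloading one chunk of n bytes.
        time.sleep(0.1)
        with shared_bytes_var.get_lock():   # serialize updates to avoid lost increments
            shared_bytes_var.value += n
        return n

    if __name__ == "__main__":
        shared_bytes_var = multiprocessing.Value(c_int, 0)
        chunks = [1024] * 50                # pretend chunk sizes in bytes
        pool = multiprocessing.Pool(4, initializer=_init_process,
                                    initargs=(shared_bytes_var,))
        mapObj = pool.map_async(download_chunk, chunks)
        total = sum(chunks)
        while not mapObj.ready():
            print "%d / %d bytes" % (shared_bytes_var.value, total)
            time.sleep(0.5)
        pool.close()
        pool.join()
        print "done:", shared_bytes_var.value, "bytes"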
answered Sep 28 '22 by iTayb