 

Set value of string in a shared c_wchar_p in a subprocess?

I have a situation like this:

The main process spawns some sub-processes that should write their results into a shared object, with both string and numeric fields. The numeric values arrive intact, but the string values are lost.

import multiprocessing as mp
from ctypes import Structure, c_double, c_wchar_p, c_int

# shared obj class
class SharedObj(Structure):
    _fields_ = [('name', c_wchar_p), ('val', c_double) ]

def run_mp( values , lock , s ) :
    for i in range( s , len( values ) , 2 ):
        lock.acquire()
        values[i].name = str( i ) # write the string value in the shared obj
        values[i].val = float( i )
        print( "tmp: %d" % i )
        lock.release()

def main():
    # creating the shared obj and mutex
    values = mp.Array(  SharedObj , [SharedObj() for i in range( 10 )] )
    lock_j = mp.Lock()

    # creating two sub-process form the function run_mp
    p1 = mp.Process( target=run_mp , args=( values , lock_j , 0 ))
    p2 = mp.Process( target=run_mp , args=( values , lock_j , 1 ))

    p1.start()
    p2.start()

    p1.join()
    p2.join()

    for v in  values:
        print()
        print( "res name: %s" % v.name )
        print( "res val: %f " % v.val )


if __name__ == '__main__':
    main()

As a result, the c_double field of the shared object is written correctly, but the strings generated in the sub-processes' run_mp (values[i].name = str( i )) are lost in the main process.

Is there a way to preserve the strings generated in the sub-processes?

The output of this code looks like the following, where the resulting strings in the main process are essentially random garbage:

tmp: 0
tmp: 2
tmp: 3
tmp: 4

res name: ����羍����羍
res val: 0.000000
res name: ����羍����羍
res val: 1.000000
res name:
res val: 2.000000   ....
Giggi asked Oct 06 '22 16:10

2 Answers

Here is a slightly modified version of your code:

#!/usr/bin/env python

import multiprocessing as mp


def run_mp( values ):
    for c_arr, c_double in values:
        # 'c' arrays hold bytes; locking is handled by the lock
        # passed to mp.Array / mp.Value below
        c_arr.value = b'hello foo'
        c_double.value = 3.14

def main():
    lock = mp.Lock()
    child_feed = []
    for i in range(10):
        child_feed.append((
            mp.Array('c', 15, lock = lock),
            mp.Value('d', 1.0/3.0, lock = lock)
        ))

    p1 = mp.Process( target=run_mp , args=(child_feed,))
    p2 = mp.Process( target=run_mp , args=(child_feed,))

    p1.start()
    p2.start()

    p1.join()
    p2.join()

    for c_arr, c_double in child_feed:
        print()
        print( "res name: %s" % c_arr.value.decode() )
        print( "res val: %f " % c_double.value )


if __name__ == '__main__':
    main()

Take a look at http://docs.python.org/library/multiprocessing.html — there is an example of using an Array of chars.
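For unicode strings specifically, the same idea works with the 'u' typecode, so the shared buffer holds wide characters directly instead of bytes. A minimal sketch of that (the worker function, the 15-slot size, and the "item-" prefix are just illustrative):

```python
import multiprocessing as mp

def worker(buf, n):
    # Write a short unicode string into the fixed-size shared buffer.
    s = "item-%d" % n
    buf[:len(s)] = s  # 'u' arrays accept str slice assignment

def main():
    buf = mp.Array('u', 15)  # 15 wchar slots, zero-initialized, with a lock
    p = mp.Process(target=worker, args=(buf, 7))
    p.start()
    p.join()
    print(buf[:].rstrip('\x00'))  # -> item-7

if __name__ == '__main__':
    main()
```

The buffer size is still fixed up front, so this only helps when a reasonable maximum string length is known.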

There is also the mmap module, which allows sharing memory (http://docs.python.org/library/mmap.html), but with it you have to synchronize access yourself, possibly with semaphores. If you prefer a simpler approach, just use pipes.
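The pipe approach avoids the fixed-size buffer entirely, since anything picklable can be sent back to the parent. A minimal sketch (the worker function and the "name-" payload are made up for illustration):

```python
import multiprocessing as mp

def worker(conn, n):
    # Send an arbitrary-length (picklable) result back to the parent.
    conn.send(("name-%d" % n, float(n)))
    conn.close()

def main():
    # duplex=False: parent end receives, child end sends
    parent, child = mp.Pipe(duplex=False)
    p = mp.Process(target=worker, args=(child, 3))
    p.start()
    name, val = parent.recv()  # blocks until the child has sent
    p.join()
    print(name, val)  # -> name-3 3.0

if __name__ == '__main__':
    main()
```

With several writers, a multiprocessing.Queue is usually more convenient than one pipe per child.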

SanityIO answered Oct 10 '22 02:10


I didn't want to use multiprocessing.Array because of the obvious requirement of specifying its size ahead of time. The following works for me instead as a multiprocessing-compatible unicode object. It is tested with Python 2.6 and multiple processes.

>>> shared_str = multiprocessing.Manager().Value(unicode, 'some initial value')
>>> shared_str.value
'some initial value'
>>> shared_str.value = 'some new value'
>>> shared_str.value
'some new value'

To address the author's specific question about sharing a string and a number together, a serializable object holding both could be created and given to Value instead.
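One way to sketch that idea without defining a custom class, assuming Python 3, is a Manager().list holding (name, val) tuples: the proxy pickles each assigned value, so arbitrary-length strings survive the round trip (the worker function and the list size here are illustrative):

```python
import multiprocessing as mp

def worker(results, i):
    # Assignment through the proxy pickles the tuple, so plain
    # Python strings of any length can be stored.
    results[i] = (str(i), float(i))

def main():
    with mp.Manager() as m:
        results = m.list([None] * 4)
        procs = [mp.Process(target=worker, args=(results, i))
                 for i in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        for name, val in list(results):
            print(name, val)  # -> "0 0.0" through "3 3.0"

if __name__ == '__main__':
    main()
```

The trade-off is that every access goes through the manager process, so this is slower than shared ctypes, but there is no size limit to declare up front.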

You may of course object to using Manager for this purpose. If so, please provide an alternative solution.

Asclepius answered Oct 10 '22 04:10