I'm trying to use a shared string variable between my Python processes, but it seems that I'm doing something wrong, since I'm getting core dumps and invalid memory values.

I use multiprocessing.Value to create a ctypes.c_char_p value and use the value attribute to access it. In my understanding of the Python docs, the value attribute should be synchronized, as long as it is an instance of Value (as opposed to an instance of RawValue). Is that correct so far?

I've created a short example to demonstrate my use of Value and to show the inconsistency while executing:
from multiprocessing import Process, Value
from ctypes import c_char_p

def process(v):
    while True:
        val = v.value
        print val
        # busy-wait until the parent assigns a new value
        while val == v.value:
            pass

v = Value(c_char_p, None)
p = Process(target=process, args=(v,))
p.start()
for i in range(1, 999):
    v.value = str(i)
p.terminate()
Processes don't share memory with other processes; threads share memory with other threads of the same process. The multiprocessing Process class is an abstraction that sets up another Python process, gives it code to run, and provides a way for the parent application to control execution. The two most important methods of the Process class are start() and join().
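For reference, a minimal sketch of that start()/join() lifecycle (the worker function and its argument are just placeholder names, not part of the question):

from multiprocessing import Process

def worker(name):
    # Runs in a separate process with its own address space.
    print("hello from %s" % name)

if __name__ == '__main__':
    p = Process(target=worker, args=("child",))
    p.start()   # spawn the child process and begin executing worker()
    p.join()    # block the parent until the child has finished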
I think the problem may be caused by using Value(c_char_p) to hold a string value: c_char_p is a pointer type, so assigning a string only stores the address of a buffer that lives in the assigning process, and the other process ends up dereferencing memory that isn't valid in its own address space. If you want a shared string, you should probably just use multiprocessing.Array(c_char) instead.

From the Python reference: https://docs.python.org/2/library/multiprocessing.html

your_string = Array('B', range(LENGTH))

You can take the identifier for the data type from the table in the array module reference: https://docs.python.org/2/library/array.html
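A minimal sketch of the Array(c_char) approach (the 10-byte buffer size is an arbitrary assumption; make it large enough for your longest string, and note that in Python 3 the value must be bytes):

from multiprocessing import Process, Array
from ctypes import c_char

def show(shared):
    # .value returns the bytes stored up to the first NUL terminator
    print(shared.value)

if __name__ == '__main__':
    shared = Array(c_char, 10)   # fixed-size, lock-protected character buffer
    shared.value = b"42"         # must fit within the buffer
    p = Process(target=show, args=(shared,))
    p.start()
    p.join()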
This is very similar in function to your example, though subtly different: the child process terminates on its own when it receives the None sentinel. The polling loop could also consume less CPU if it used a timeout (see the sketch after the code).
from multiprocessing import Process, Pipe

def show_numbers(consumer):
    while True:
        if consumer.poll():
            val = consumer.recv()
            if val is None:   # sentinel: the producer is finished
                break
            print(val)

(consumer, producer) = Pipe(False)
proc = Process(target=show_numbers, args=(consumer,))
proc.start()
for i in range(1, 999):
    producer.send(str(i))
producer.send(None)
proc.join()
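For reference, a sketch of the same receiving loop with a poll timeout (the 0.1-second value is an arbitrary choice; the producer side of the example stays unchanged):

def show_numbers(consumer):
    while True:
        # Block for up to 0.1 s waiting for data instead of spinning.
        if consumer.poll(0.1):
            val = consumer.recv()
            if val is None:
                break
            print(val)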