I have multiple worker processes reading from the same multiprocessing.Queue(). Each worker process should only read the items that belong to it and must leave the other items untouched. So basically, a worker process must first inspect the queue contents and then decide whether to pop an item.
Is there any way to do this with multiprocessing.Queue?
A queue is a data structure to which items are added with put() and from which items are retrieved with get(). multiprocessing.Queue provides a first-in, first-out (FIFO) queue, which means items are retrieved in the order they were added.
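A minimal sketch of that FIFO behaviour (the queue name and items here are just placeholders):
from multiprocessing import Queue

q = Queue()
q.put('first')
q.put('second')
print(q.get())   # 'first'  -- items come out in the order they were put in
print(q.get())   # 'second'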
A peek operation returns the item at the front of a queue without removing it, whereas get() removes the item it returns. multiprocessing.Queue does not provide a peek method.
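For illustration, with the standard-library queue.Queue (single-process) a peek can be emulated by reading its internal deque; note that this relies on a non-public implementation detail, and multiprocessing.Queue offers nothing comparable:
from queue import Queue   # standard-library, single-process queue

q = Queue()
q.put('a')
q.put('b')

# q.queue is the internal deque; indexing it is a common (but non-public)
# way to look at the front item without removing it.
print(q.queue[0])   # 'a'
print(q.get())      # 'a' -- the peek above did not consume the item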
A simple way to communicate between processes with multiprocessing is to use a Queue to pass messages back and forth. Any picklable object can pass through a Queue. The short example below passes a single message to a single worker, then the main process waits for the worker to finish.
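A sketch of that pattern (the function and message names are illustrative):
from multiprocessing import Process, Queue

def worker(q):
    msg = q.get()              # blocks until the main process sends a message
    print('worker received:', msg)

if __name__ == '__main__':
    q = Queue()
    p = Process(target=worker, args=(q,))
    p.start()
    q.put('hello worker')      # the single message passed through the queue
    p.join()                   # main process waits for the worker to finish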
Yes, multiprocessing.Queue is safe to share between workers. From https://docs.python.org/3/library/multiprocessing.html#exchanging-objects-between-processes: "Queues are thread and process safe."
You can always put back messages that you don't need (if order is not an issue):
import time
from random import random

def get_my_job(q):
    while True:
        job = q.get()
        if job == 'mine':           # keep jobs addressed to this worker
            return job
        q.put(job)                  # put other workers' jobs back on the queue
        time.sleep(random() / 2)    # brief pause so other workers get a chance
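A usage sketch, assuming get_my_job takes the shared queue as an argument (as written above) and the queue has been pre-filled with tagged jobs; the 'mine'/'other' tags are placeholders:
from multiprocessing import Process, Queue

def worker(q):
    job = get_my_job(q)            # returns only once a 'mine' job is found
    print('worker got:', job)

if __name__ == '__main__':
    q = Queue()
    for job in ['other', 'mine', 'other']:
        q.put(job)                 # 'other' jobs are left for other workers
    p = Process(target=worker, args=(q,))
    p.start()
    p.join()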
If the order is important, you can use multiple Queues so that each message type gets its own Queue:
from multiprocessing import Queue

queues = {
    'queue4worker_type1': Queue(),
    'queue4worker_type2': Queue(),
}
# each worker now consumes only messages of the type it is interested in
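A minimal sketch of how workers could each be handed their own queue (the worker-type names follow the dict above; everything else is illustrative):
from multiprocessing import Process, Queue

def worker(my_queue):
    msg = my_queue.get()           # only ever sees messages for its own type
    print('handled:', msg)

if __name__ == '__main__':
    queues = {
        'queue4worker_type1': Queue(),
        'queue4worker_type2': Queue(),
    }
    workers = [Process(target=worker, args=(q,)) for q in queues.values()]
    for p in workers:
        p.start()
    queues['queue4worker_type1'].put('message for type1')
    queues['queue4worker_type2'].put('message for type2')
    for p in workers:
        p.join()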