Possible Duplicate:
Return value from thread
I want to get the "free memory" of a bunch of servers like this:
import os

def get_mem(servername):
    res = os.popen('ssh %s "grep MemFree /proc/meminfo | sed \'s/[^0-9]//g\'"' % servername)
    return res.read().strip()
Since this can be threaded, I want to do something like this:
import threading
thread1 = threading.Thread(target=get_mem, args=("server01", ))
thread1.start()
But now: how can I access the return value(s) of the get_mem functions?
Do I really need to go the full-fledged way of creating a class MemThread(threading.Thread) and overriding __init__ and run?
from multiprocessing.pool import ThreadPool

pool = ThreadPool(processes=1)
async_result = pool.apply_async(foo, ('world', 'foo'))  # tuple of args for foo
# do some other stuff in the main process
return_val = async_result.get()  # get the return value from your function
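To make the snippet above self-contained, here is a minimal runnable sketch; foo is a trivial placeholder standing in for real work such as the ssh call:

```python
from multiprocessing.pool import ThreadPool

def foo(bar, baz):
    # placeholder for real work such as the ssh call
    return 'hello {0} {1}'.format(bar, baz)

pool = ThreadPool(processes=1)
async_result = pool.apply_async(foo, ('world', 'foo'))  # runs foo in a worker thread
# ... do other work in the main thread ...
return_val = async_result.get()  # blocks until foo returns, then yields its result
print(return_val)  # hello world foo
```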
A thread cannot return a value directly. The start() method on a thread calls the run() method, which executes your code in a new thread of execution. The run() method in turn may call a target function, if one was configured.
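One common workaround is a small Thread subclass whose run() stores the target's return value on the instance; a sketch (ResultThread is a hypothetical helper name, and it relies on the private _target/_args/_kwargs attributes set by Thread.__init__):

```python
import threading

class ResultThread(threading.Thread):
    """Thread that keeps the target's return value in self.result."""
    def run(self):
        # _target/_args/_kwargs are set by Thread.__init__
        self.result = self._target(*self._args, **self._kwargs)

t = ResultThread(target=lambda x: x * 2, args=(21,))
t.start()
t.join()          # wait for the thread to finish
print(t.result)   # 42
```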
You can always call the same function from multiple threads; that is not a problem. Problems only arise if the threads share mutable data.
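When threads do share mutable state, guard it with a lock; a minimal sketch of the standard pattern:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:          # serialize access to the shared counter
            counter += 1

threads = [threading.Thread(target=increment, args=(10000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000
```

Without the lock, the read-modify-write on counter could interleave and lose updates.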
You could create a synchronised queue, pass it to the thread function and have it report back by pushing the result into the queue, e.g.:
import os, threading, queue

def get_mem(servername, q):
    res = os.popen('ssh %s "grep MemFree /proc/meminfo | sed \'s/[^0-9]//g\'"' % servername)
    q.put(res.read().strip())

# ...
q = queue.Queue()
threading.Thread(target=get_mem, args=("server01", q)).start()
result = q.get()
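The same pattern scales to several servers: start one thread per host and drain the queue afterwards. A sketch with get_mem stubbed out (the ssh call replaced by a dummy value so the example is self-contained; have the worker put a (servername, value) pair so results can be matched to hosts):

```python
import threading, queue

def get_mem(servername, q):
    # stand-in for the ssh/grep call above
    q.put((servername, '12345'))

q = queue.Queue()
servers = ['server01', 'server02', 'server03']
threads = [threading.Thread(target=get_mem, args=(s, q)) for s in servers]
for t in threads:
    t.start()
for t in threads:
    t.join()

# one queue entry per server; order of arrival does not matter
results = dict(q.get() for _ in servers)
print(results)
```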
For the record, this is what I finally came up with (adapted from the multiprocessing examples):
from multiprocessing import Process, Queue

def execute_parallel(hostnames, command, max_processes=None):
    """
    Run the command in parallel on the specified hosts; returns the
    output of the commands as a dict.

    >>> execute_parallel(['host01', 'host02'], 'hostname')
    {'host01': 'host01', 'host02': 'host02'}
    """
    NUMBER_OF_PROCESSES = max_processes if max_processes else len(hostnames)

    def worker(jobs, results):
        for hostname, command in iter(jobs.get, 'STOP'):
            results.put((hostname, execute_host_return_output(hostname, command)))

    job_queue = Queue()
    result_queue = Queue()

    for hostname in hostnames:
        job_queue.put((hostname, command))

    for i in range(NUMBER_OF_PROCESSES):
        Process(target=worker, args=(job_queue, result_queue)).start()

    result = {}
    for i in range(len(hostnames)):
        result.update([result_queue.get()])

    # tell the worker processes to stop
    for i in range(NUMBER_OF_PROCESSES):
        job_queue.put('STOP')

    return result