I have a group of 60 objects, each of which I want to run in its own `threading.Thread`. To break this down further because of Python's locking (the GIL), I wanted to spawn sub-processes (using `multiprocessing.Process`) and run 6 `Thread`s per `Process`. I broke my objects out into a two-dimensional list to make them easier to loop through, such that the first index of `obj` is the `Process` number and each element of `obj[i]` is one of the objects I am working with as a `Thread`. So here's the breakdown:
import logging
from multiprocessing import Process
from threading import Thread

# break the objects out into my 2D list of 6 per sublist
obj = []
for i in all_obj:
    if len(obj) == 0 or len(obj[-1]) > 5:
        obj.append([])
    obj[-1].append(i)

# spawn processes
processes = []
for i in obj:
    processes.append(Process(target=proc_run, args=(i,)))
    processes[-1].start()

# process target
def proc_run(my_objs):
    threads = []
    for ad in my_objs:
        threads.append(Thread(target=thread_run, args=(ad,)))
        threads[-1].start()

# thread target
def thread_run(my_obj):
    for i in range(1, 21):
        ## do some stuff with the object here
        pass
    logging.info("Thread for object <%s> finished." % my_obj.prop)
The problem is that the threads don't actually appear to spawn unless I add a `join()` right after the `start()` call. Since that defeats my desire to multithread (I could just use a `for` loop and accomplish the same thing), I'm not entirely sure what to do.
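To show what I mean about `join()`: here's a little experiment I ran to convince myself that joining after *all* the starts still lets the work overlap (the 0.2-second sleep just stands in for real work):

```python
import time
from threading import Thread

def work(results, idx):
    time.sleep(0.2)  # stand-in for real per-object work
    results[idx] = True

results = [False] * 6
threads = [Thread(target=work, args=(results, i)) for i in range(6)]

start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    # joining here only blocks the main thread; the six sleeps
    # already started, so they run concurrently and the total
    # time is roughly 0.2s, not 6 * 0.2s
    t.join()
elapsed = time.perf_counter() - start
```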
I'm a total noob when it comes to this threading stuff, so the more dumbed-down you can make your answers, the easier it will be for all involved. Thanks.
Since Python's multiprocessing API basically mirrors the threading API anyway, I just decided to use 60 processes. This accomplishes (basically) the same thing I was aiming for; it will just make the Processes tab in Task Manager blow up a bit. ;)