In using the Pool object from the multiprocessing module, is the number of processes limited by the number of CPU cores? E.g. if I have 4 cores, even if I create a Pool with 8 processes, only 4 will be running at one time?
You can ask for as many processes as you like; any limit that may exist will be imposed by your operating system, not by multiprocessing. For example,
p = multiprocessing.Pool(1000000)
is likely to suffer an ugly death on any machine. I'm trying it on my box as I type this, and the OS is grinding my disk to dust, swapping out RAM madly; I finally killed it after it had created about 3000 processes ;-)
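At a sane scale, though, asking for more workers than you have cores is perfectly legal. A minimal sketch (the `square` worker is just a placeholder): the Pool runs to completion because the OS simply time-slices the extra processes.

```python
import multiprocessing

def square(x):
    return x * x

if __name__ == "__main__":
    # 8 workers is fine even on a 4-core box; the OS schedules them,
    # so at most "number of cores" of them are on a CPU at any instant.
    with multiprocessing.Pool(8) as pool:
        print(pool.map(square, range(10)))
        # -> [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```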
As to how many will run "at one time", Python has no say in that. It depends on how many cores your hardware has and on how your operating system decides to share them among all the processes currently running.
For CPU-bound tasks, it doesn't make sense to create more Pool processes than you have cores to run them on. If you're trying to use your machine for other things too, then you should create fewer processes than cores.
For I/O-bound tasks, it may make sense to create quite a few more Pool processes than cores, since the processes will probably spend most of their time blocked (waiting for I/O to complete).
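A sketch of the I/O-bound case, using `time.sleep` as a stand-in for real I/O (an assumption for illustration): eight workers that each block for half a second finish in roughly the time of one sleep, not eight, because a blocked process doesn't occupy a core.

```python
import multiprocessing
import time

def fake_io(i):
    time.sleep(0.5)   # stand-in for a blocking read or network request
    return i

if __name__ == "__main__":
    start = time.time()
    # Oversubscribe deliberately: 8 workers even on fewer cores,
    # because a sleeping process leaves its core free.
    with multiprocessing.Pool(8) as pool:
        pool.map(fake_io, range(8))
    elapsed = time.time() - start
    # All eight sleeps overlap, so elapsed is well below 8 * 0.5 = 4 seconds.
    print(f"{elapsed:.2f}s")
```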
Yes. In theory there is no limit on the number of processes you can create, but starting an insane number at once will kill the system by running it out of memory. Note that processes have a much larger footprint than threads: threads share their parent's address space, while each process gets an address space of its own.
So the best practice is to use a semaphore limited to the number of processors on your system, like:
sem = multiprocessing.Semaphore(4)  # number of CPUs on your system
If you don't know how many cores your system has, or if you want the code to run on many systems, generic code like the following will do:
sem = multiprocessing.Semaphore(multiprocessing.cpu_count())
# this detects the number of cores in your system and creates a semaphore with that value
P.S. It is usually good to use the number of cores minus 1, leaving one core free for the rest of the system.
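Note that creating a Semaphore by itself limits nothing: each worker has to acquire it. A minimal sketch of that pattern (the `worker` function, the result queue, and the squaring are all just illustrative):

```python
import multiprocessing

def worker(sem, i, results):
    with sem:  # at most cpu_count() workers get past this line at once
        results.put(i * i)

if __name__ == "__main__":
    sem = multiprocessing.Semaphore(multiprocessing.cpu_count())
    results = multiprocessing.Queue()
    procs = [multiprocessing.Process(target=worker, args=(sem, i, results))
             for i in range(10)]
    for p in procs:
        p.start()
    # Drain the queue before joining, per the multiprocessing guidelines.
    values = [results.get() for _ in range(10)]
    for p in procs:
        p.join()
    print(sorted(values))
    # -> [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```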
Hope this helps :)
While there is no limit you can set, if you are looking for a sensible number to use for CPU-bound processes (which I suspect is what you're after here), you can run the following:
>>> import multiprocessing
>>> multiprocessing.cpu_count()
1
Some good notes on limitations (especially on Linux) are noted in the answer here:
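One such Linux limitation worth knowing about: `cpu_count()` reports every logical core in the machine, while the scheduler affinity mask reports only the cores this particular process is allowed to use (e.g. under taskset or a container CPU limit). A small sketch, assuming `os.sched_getaffinity` is available (it exists only on some Unixes, Linux included):

```python
import os
import multiprocessing

total = multiprocessing.cpu_count()        # all logical cores in the machine
if hasattr(os, "sched_getaffinity"):       # Linux and some other Unixes
    usable = len(os.sched_getaffinity(0))  # cores this process may run on
else:                                      # e.g. macOS, Windows
    usable = total
print(total, usable)                       # usable is at most total
```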