
Python multiprocessing load balancer

Short question: Is it possible to have N worker processes and a balancer process that finds an idle worker and passes a UnitOfWork to it?

Long question: Imagine a class like this, which will be subclassed for certain tasks:

class UnitOfWork:
  def __init__(self, **some_starting_parameters):
    pass
  def init(self):
    # open connections, etc.
    pass
  def run(self):
    # do the job
    pass

Start the balancer and the worker processes:

balancer = LoadBalancer()
workers  = balancer.spawn_workers(10)

Deploy work (the balancer should find an idle worker and pass the task to it; if every worker is busy, it should queue the UOW and wait for a worker to become free):

balancer.work(UnitOfWork(some=parameters))
# internally: find a free worker, pass the UOW, call uow.init() + uow.run()

Is this possible (or is it crazy)?

PS I'm familiar with the multiprocessing Process class and process pools, but:

  • Every Process instance starts a new process (yep :) ) - I want a fixed number of workers
  • I want a Process instance that can do generic work
asked Nov 29 '25 by canni

1 Answer

I suggest you take a look at multiprocessing.Pool() because I believe it exactly solves your problem. It runs N "worker processes" and as each worker finishes a task, another task is provided. And there is no need for "poison pills"; it is very simple.
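For example, here is a minimal sketch (the UnitOfWork class and execute() helper are hypothetical, mirroring the question) of how a Pool can play the role of the balancer: it keeps a fixed number of worker processes and hands each idle worker the next task.

```python
from multiprocessing import Pool

class UnitOfWork:
    """Hypothetical task object, mirroring the class from the question."""
    def __init__(self, value):
        self.value = value
    def init(self):
        # open connections, etc.
        pass
    def run(self):
        # do the job
        return self.value * 2

def execute(uow):
    # Runs inside a worker process; the pool dispatches each
    # task to whichever worker is free.
    uow.init()
    return uow.run()

if __name__ == "__main__":
    tasks = [UnitOfWork(n) for n in range(5)]
    with Pool(processes=3) as pool:   # fixed number of workers
        results = pool.map(execute, tasks)
    print(results)  # [0, 2, 4, 6, 8]
```

Note that the task objects must be picklable, since they are sent to the worker processes.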

I have always used the .map() method on the pool.

Python multiprocessing.Pool: when to use apply, apply_async or map?
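If you want to submit tasks one at a time rather than as a batch, apply_async() maps closely onto the balancer.work() call from the question; a minimal sketch (the work() function is a hypothetical stand-in for a unit of work):

```python
from multiprocessing import Pool

def work(x):
    # stand-in for a unit of work
    return x * x

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # apply_async queues each task without blocking;
        # free workers pick them up as they become available.
        futures = [pool.apply_async(work, (n,)) for n in range(10)]
        results = [f.get() for f in futures]
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```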

EDIT: Here is an answer I wrote to another question, where I used multiprocessing.Pool():

Parallel file matching, Python

answered Dec 01 '25 by steveha
