I know that by using the -P switch or the @parallel decorator I can run tasks in parallel across multiple hosts.
What I'm trying to do is execute multiple long-running tasks in parallel on the same host:
from fabric.api import task, execute

@task
def task1():
    # long running op
    pass

@task
def task2():
    # long running op
    pass

@task
def task3():
    # long running op
    pass

@task
def backup_all():
    execute(task1)
    execute(task2)
    execute(task3)
How can I start task1, task2 and task3 in parallel on the same host using Fabric? I know that I could run multiple fab processes with different tasks, but I'm looking for a solution that stays within Fabric.
You have a number of ways to address this. You could rely on shell-level job control on the remote host: start each long-running command in the background (with & or bg) and then wait for them all to complete. The bash documentation on job control explains this mechanism.
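A minimal sketch of that shell-level approach, assuming the long-running operations can be expressed as remote commands (the command names below are placeholders). All three are launched in the background within a single shell session, and wait blocks until every one has finished:

from fabric.api import task, run

@task
def backup_all_shell():
    # Start each command in the background, then wait for all of them
    # in the same shell session so the run() call only returns when
    # every background job is done.
    run("backup_db & backup_files & backup_logs & wait")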
If you still really want to drive this from Fabric/Python, you'd likely need to use the job_queue that's already in the library and more or less write your own queue to push these tasks into, or read up on the multiprocessing module and do some simple Python forking yourself, which is essentially all the job_queue is doing anyway.
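Here is a minimal sketch of the multiprocessing route (not Fabric's internal job_queue): each execute() call runs in its own process, so the three tasks from the question proceed in parallel against the same host.

from multiprocessing import Process
from fabric.api import task, execute

@task
def backup_all_parallel():
    # One worker process per task; execute() inside each process runs
    # the task against the current host settings.
    procs = [Process(target=execute, args=(t,)) for t in (task1, task2, task3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()  # block until all three tasks have finished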