I am writing a script to launch a load generation experiment on several hosts. I could write a bash script to start multiple ssh sessions, but I was hoping to use something more structured. Since I use Python for most of my scripting, I thought Fabric looked like a good option.
The only problem is that I need to pass a small amount of host-specific data with each command (really just an id or counter), and I would like to run the commands in parallel.
In other words, I would like to do something like the following, where host_num is different (possibly just incremented) for each host.
@parallel
def launch():
    with cd('/working/dir'):
        run("./start/script -id=%d" % host_num)
Is this possible in Fabric? If not, is there another tool I could use to accomplish the same thing?
You could check against the user / host: each task knows the environment it is currently running in:
env.hosts = ['user@host1.com', 'user@host2.com']

@task
def test():
    print '%(user)s@%(host)s' % env
    if env.host == 'host1.com':
        id = 1
    elif ...
    run('echo "%s"' % id)
Feel free to write it in a more elegant way :) (one suggestion being a dictionary used like a case statement for the id lookup).
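The dictionary lookup hinted at above could be sketched like this (a minimal sketch; the hostnames, id values, and the `id_for` helper are illustrative assumptions, not from the original post):

```python
# Hypothetical mapping from hostname to experiment id
# (these hostnames/ids are placeholders for your own).
HOST_IDS = {
    'host1.com': 1,
    'host2.com': 2,
}

def id_for(host):
    """Look up the id for a host, failing loudly if it is not configured."""
    try:
        return HOST_IDS[host]
    except KeyError:
        raise ValueError('no id configured for host %r' % host)
```

Inside the Fabric task you would then call something like `run('./start/script -id=%d' % id_for(env.host))`. This also plays well with `@parallel`, since each parallel worker sees its own value of `env.host`.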