When I'm administering dozens of servers with Fabric, I often don't care about the specifics of the commands being run on each server; instead, I want to collate small bits of information from each host and present them in a summary at the end.
Does Fabric support this functionality itself? (I've searched the documentation to no avail, but perhaps I missed something).
Otherwise, I suppose one could aggregate this information manually and then add an exit handler (a rough sketch is below), but this feels like something that could be a common use case.
As an example, I have some scripts that do basic security checks on a number of servers, and I'd like to create a report at the end instead of scrolling through the output for each server. I don't want to restrict Fabric's output, since if there is an issue I want to scroll back to pinpoint it.
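To make the manual approach concrete, here's roughly what I mean (the results dict, the check_security task, and the command it runs are just placeholders, not anything Fabric provides): collect per-host results in a module-level dict and print them from an atexit handler when the fab run finishes. This only works for serial runs, since parallel mode forks subprocesses and shared state wouldn't survive.

import atexit

from fabric.api import env, run

results = {}  # per-host findings, keyed by host string

def check_security():
    '''Example task: run a check and record its output for the summary'''
    out = run('some-security-check')  # placeholder command
    results[env.host_string] = out.rstrip()

def _print_summary():
    '''Exit handler: print the collated results once all hosts are done'''
    if results:
        print('--- Summary ---')
        for host, out in sorted(results.items()):
            print('%s: %s' % (host, out))

atexit.register(_print_summary)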
This is probably a little dated now, and Fabric has certainly evolved a lot since you asked this question... though, as Morgan stated, you basically just need a wrapper script to contain the workhorse, and then it's "just Python" from there. This is briefly addressed in the execution model documentation.
For example, here is one way you might wrap something like uptime (though, obviously, this can get a lot more complicated):
from fabric.api import env, execute, hide, parallel, run, runs_once
from fabric.colors import cyan

@parallel
def _get_uptime():
    '''Retrieve and return uptime for each host'''
    with hide('stdout'):
        up = run('uptime')
    return up.rstrip()

@runs_once
def uptime_sorted():
    '''System: System uptime (sorted) - Use parallel for best effect'''
    print(cyan("[%(host)s] Executing on %(host)s as %(user)s" % env))
    system_uptimes = execute(_get_uptime)
    for sys, up in sorted(system_uptimes.items()):
        print("%s: %s" % (sys, up))
This makes uptime_sorted your entry task, with _get_uptime doing all the work of gathering the data for each host and returning it.
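With a host list defined (env.hosts in the fabfile, or -H on the command line), you'd invoke it as, say:

fab -H web1,web2,web3 uptime_sorted

(web1, web2, web3 are placeholder hostnames.) Since _get_uptime is decorated with @parallel, execute() runs it across all hosts concurrently and returns a dict mapping each host string to the task's return value, which is exactly what the sorted loop iterates over.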
It's just Python, so you can print whatever you'd like, or write your own decorator to wrap the task and emit the summary. As it stands, though, there isn't anything in core nor contrib that does that.
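For instance, a hypothetical decorator along those lines (report_result and disk_usage are illustrative names, not part of Fabric) might look like:

import functools

from fabric.api import env, run

def report_result(task):
    '''Wrap a task so each host's return value is echoed as it completes'''
    @functools.wraps(task)
    def wrapper(*args, **kwargs):
        result = task(*args, **kwargs)
        print('[summary] %s -> %s' % (env.host_string, result))
        return result
    return wrapper

@report_result
def disk_usage():
    '''Example task whose result the decorator reports'''
    # placeholder command; substitute whatever check you need
    return run('df -h / | tail -1').rstrip()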