I'd like to write a Python script to perform some very simple "agentless" monitoring of remote processes running on Linux servers.
It would perform the following tasks, in pseudocode:
for each remoteIPAddress in listOfIPAddresses:
    log into server@remoteIPAddress via ssh
    execute the equivalent of a 'ps -ef' command
    grep the result to make sure a particular process (by name) is still running
One way to do this is to have Python call shell scripts in a subprocess and parse their output. That seems pretty inefficient. Is there a better way to do this via Python libraries?
All I could find via research here and elsewhere was:
Thanks, and please go easy on me, it's my first question :-)
The Fabric library may be of interest to you.
Check out paramiko. You can use it to ssh into the server and run commands. You can then parse the results and do what you'd like with them.
Taking cues from the answers above, I investigated Fabric and found the following presentation particularly interesting/helpful. It is an overview of three libraries -- Fabric, Cuisine, and Watchdog -- for server monitoring and administration. For posterity:
Using Fabric, Cuisine, and Watchdog for server administration in Python