Python to emulate remote tail -f?

Tags: python, ssh, tail

We have several application servers, and a central monitoring server.

We are currently running ssh with "tail -f" from the monitoring server to stream several text logfiles in realtime from the app servers.

The issue, apart from the brittleness of the whole approach, is that killing the ssh process can sometimes leave zombie tail processes behind. We've mucked around with using -t to create pseudo-terminals, but it still sometimes leaves zombie processes around, and -t apparently also causes issues elsewhere with the job scheduling product we're using.

As a cheap-and-dirty solution until we can get proper centralised logging (Logstash and RabbitMQ, hopefully), I'm hoping to write a simple Python wrapper that will start ssh and "tail -f", still capture the output, but store the PID to a textfile on disk so we can kill the appropriate tail process later if need be.

I first tried using subprocess.Popen, but then I hit issues with actually getting the "tail -f" output back in realtime (which then needs to be redirected to a file) - apparently there are a host of blocking/buffering issues.
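
To illustrate, here's a rough sketch of the kind of wrapper I mean (the hostname, logfile and PID-file paths are just placeholders) - it streams lines and records the local ssh PID, but the buffering is what I can't get right:

import subprocess
import sys

# placeholder host, logfile and PID file - for illustration only
HOST = "app01.example.com"
LOGFILE = "/var/log/app/app.log"
PIDFILE = "/tmp/tail_app01.pid"

proc = subprocess.Popen(
    ["ssh", HOST, "tail", "-f", LOGFILE],
    stdout=subprocess.PIPE,
    bufsize=1,                  # line-buffered (only applies in text mode)
    universal_newlines=True,
)

# record the local ssh PID so we can kill it later
with open(PIDFILE, "w") as f:
    f.write(str(proc.pid))

# stream lines as they arrive; without a remote tty the output
# may still be block-buffered, which is part of the problem
for line in proc.stdout:
    sys.stdout.write(line)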

A few sources seemed to recommend using pexpect, pxssh or something like that. Ideally I'd like to use just Python and its included libraries, if possible - however, if a library is really the only way to do this, then I'm open to that.

Is there a nice easy way of getting Python to start up ssh with "tail -f", get the output in realtime printed to local STDOUT here (so I can redirect to a local file), and also saving the PID to a file to kill later? Or even if I don't use ssh with tail -f, some way of still streaming a remote file in (near) realtime that includes saving the PID to a file?

Cheers, Victor

EDIT: Just to clarify - we want the tail process to die when we kill the SSH process.

We want to start ssh and "tail -f" from the monitoring server, then when we Ctrl-C that, the tail process on the remote box should die as well - we don't want it to stay behind. Normally ssh with -t should fix it, but it isn't fully reliable, for reasons I don't understand, and it doesn't play nicely with our job scheduling.

Hence, using screen to keep the process alive at the other end is not what we want.
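
And the matching cleanup we'd run later, reading the PID back from the file (again just a sketch, with the same placeholder path as above):

import os
import signal

# kill the local ssh process recorded earlier; with a pty allocated,
# the remote tail should (in theory) die with it
with open("/tmp/tail_app01.pid") as f:
    pid = int(f.read().strip())
os.kill(pid, signal.SIGTERM)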

asked Oct 06 '11 by victorhooi


3 Answers

I wrote a function that does that:

import paramiko
import time
import json

DEFAULT_MACHINE_USERNAME="USERNAME"
DEFAULT_KEY_PATH="DEFAULT_KEY_PATH"

def ssh_connect(machine, username=DEFAULT_MACHINE_USERNAME,
                key_filename=DEFAULT_KEY_PATH):
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(hostname=machine, username=username, key_filename=key_filename)
    return ssh

def tail_remote_file(hostname, filepath, key_path=DEFAULT_KEY_PATH,
                     close_env_variable="CLOSE_TAIL_F", env_file='~/.profile'):
    """Poll filepath on hostname and print new lines until the remote
    environment variable close_env_variable (kept in env_file) is set
    to true."""
    ssh = ssh_connect(hostname, key_filename=key_path)

    def set_env_variable(to_value):
        # flip the exported stop flag in the remote env file via sed
        to_value_str = "true" if to_value else "false"
        from_value_str = "false" if to_value else "true"
        ssh.exec_command('sed -i \'s/export %s=%s/export %s=%s/g\' %s' %
                         (close_env_variable, from_value_str,
                          close_env_variable, to_value_str, env_file))
        time.sleep(1)

    def get_env_variable():
        # read the stop flag back from the remote environment file
        command = "source %s; echo $%s" % (env_file, close_env_variable)
        stdin, stdout_i, stderr = ssh.exec_command(command)
        out = stdout_i.read().decode().replace('\n', '')
        return out

    def get_last_line_number(lines_i, line_num):
        return int(lines_i[-1].split('\t')[0]) + 1 if lines_i else line_num

    def execute_command(line_num):
        # number every line, then emit only those from line_num onwards
        command = "cat -n %s | tail --lines=+%d" % (filepath, line_num)
        stdin, stdout_i, stderr = ssh.exec_command(command)
        err = stderr.read().decode()
        if err:
            print(err)
        return stdout_i.readlines()

    stdout = get_env_variable()
    if not stdout:
        ssh.exec_command("echo 'export %s=false' >> %s" %
                         (close_env_variable, env_file))
    else:
        ssh.exec_command(
            'sed -i \'s/export %s=true/export %s=false/g\' %s' %
            (close_env_variable, close_env_variable, env_file))
    set_env_variable(False)

    lines = execute_command(0)
    last_line_num = get_last_line_number(lines, 0)

    while not json.loads(get_env_variable()):
        for l in lines:
            print('\t'.join(t.replace('\n', '') for t in l.split('\t')[1:]))
        last_line_num = get_last_line_number(lines, last_line_num)
        lines = execute_command(last_line_num)
        time.sleep(1)

    ssh.close()
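
A hypothetical call, assuming key-based SSH access to the app server (the hostname, log path and key path are placeholders):

# stream /var/log/app.log from app01 until the remote CLOSE_TAIL_F
# flag in ~/.profile is flipped to true
tail_remote_file('app01.example.com', '/var/log/app.log',
                 key_path='/home/monitor/.ssh/id_rsa')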
answered Sep 21 '22 by Yonatan Kiron


The paramiko module supports connecting via SSH from Python.

http://www.lag.net/paramiko/

pysftp has some examples of using it, and the execute command method might be what you're looking for. It will create a file-like object of the command you execute. I can't say if it gives you live data though.

http://code.google.com/p/pysftp/
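
For the tail -f case specifically, here is a minimal sketch using paramiko's exec_command (the hostname, username and log path are assumptions, not tested against your setup). Requesting a pseudo-terminal with get_pty=True ties the remote tail to the channel, so closing the connection should also kill the remote process:

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('app01.example.com', username='monitor')

# get_pty=True allocates a pseudo-terminal, so the remote tail
# should die when the channel is closed
stdin, stdout, stderr = ssh.exec_command('tail -f /var/log/app.log',
                                         get_pty=True)
try:
    for line in iter(stdout.readline, ''):
        print(line, end='')
except KeyboardInterrupt:
    pass
finally:
    ssh.close()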

answered Nov 08 '22 by McLeopold


I think the screen idea is the best one, but if you don't want to ssh and you'd rather have a Python script do it, here is a simple Pythonic XML-RPC way of getting the info. It will only update when something has been appended to the file in question.

This is the client file. You tell it which file you want to read from and what computer it's on.

#!/usr/bin/python
# This should be run on the computer you want to output the files
# You must pass a filename and a location
# filename must be the full path from the root directory, or relative path
# from the directory the server is running
# location must be in the form http://host:port (e.g. http://localhost:8000)

import xmlrpclib, time, sys, os

def tail(filename, location):
   # connect to server
   s = xmlrpclib.ServerProxy(location)

   # get starting length of file
   curSeek = s.GetSize(filename)

   # constantly check
   while 1:
      time.sleep(1) # make sure to sleep

      # get a new length of file and check for changes
      prevSeek = curSeek

      # sometimes the call fails if the file is being written to,
      # so wait another second and try again
      try:
         curSeek = s.GetSize(filename)
      except Exception:
         pass

      # if file length has changed print it
      if prevSeek != curSeek:
         print s.tail(filename, prevSeek),


def main():
   # check that we got a filename and a server location passed to us
   if len(sys.argv) != 3 or not os.path.isfile(sys.argv[1]):
      print 'Usage: client.py <filename> <http://host:port>'
      return

   # run tail function
   tail(sys.argv[1], sys.argv[2])

main()

This is the server; you run it on each computer that has a file you want to look at. It's nothing fancy. You can daemonize it if you want. You just run it, and your client should connect to it as long as you tell the client where it is and you have the right ports open.

#!/usr/bin/python
# This runs on the computer(s) you want to read the file from
# Make sure to change out the HOST and PORT variables
HOST = 'localhost'
PORT = 8000

from SimpleXMLRPCServer import SimpleXMLRPCServer
from SimpleXMLRPCServer import SimpleXMLRPCRequestHandler

import os

def GetSize(filename):
   # return the current size of the file in bytes
   return os.stat(filename).st_size

def tail(filename, seek):
   # open the file, jump to the last position read, and return
   # everything written since then
   f = open(filename, 'r')
   f.seek(seek)
   data = f.read()
   f.close()
   return data

def CreateServer():
   # Create server
   server = SimpleXMLRPCServer((HOST, PORT),
                               requestHandler=SimpleXMLRPCRequestHandler)

   # register functions
   server.register_function(tail, 'tail')
   server.register_function(GetSize, 'GetSize')

   # Run the server's main loop
   server.serve_forever()

# start server
CreateServer()

Ideally you run the server once, then from the client run "python client.py sample.log http://somehost:8000" and it should start going. Hope that helps.
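
To sanity-check the server from a Python 2 prompt (assuming the default HOST and PORT above, and a sample.log the server can see):

import xmlrpclib
s = xmlrpclib.ServerProxy('http://localhost:8000')
print s.GetSize('sample.log')   # current size of the file in bytes
print s.tail('sample.log', 0),  # everything written so far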

answered Nov 08 '22 by Hoopdady