 

Getting stdout from a tcpdump subprocess after terminating it

I am running tcpdump in a subprocess like this:

pcap_process = subprocess.Popen(['tcpdump', '-s 0', '-w -', 'tcp'], 
                                  stdout=subprocess.PIPE, stderr=subprocess.PIPE)

The -w - argument is important: it tells tcpdump to write the resulting .pcap data to stdout instead of to a file.

I then go on to access a website using urllib.urlopen(). After this is done, I would like to kill tcpdump and put whatever it wrote to stdout into a string. I have tried the following:

pcap_process.terminate()
result = pcap_process.stdout.read()    # or readline(), etc.

But (unless I'm doing something wrong) that doesn't work: once I have killed the process, there seems to be nothing left to read. If I use read() or communicate() before terminating, my script just sits there reading, waiting for tcpdump to finish (which it won't).

Is there a way to do this (preferably without loops)?

asked Aug 23 '11 by sk29910


1 Answer

Instead of driving tcpdump as a subprocess, it's often advisable to use libpcap directly (through a Python binding) or Scapy.
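For example, a rough sketch of the same capture done with Scapy might look like this (the filter, timeout, URL and output filename are all illustrative, the request uses Python 3's urllib.request, and sniffing generally requires root privileges):

from scapy.all import sniff, wrpcap
import threading
import urllib.request

captured = []

def capture():
    # Capture TCP traffic for a few seconds; 5 is an arbitrary example value.
    captured.extend(sniff(filter='tcp', timeout=5))

# Sniff in a background thread while the request runs.
t = threading.Thread(target=capture)
t.start()
urllib.request.urlopen('http://example.com').read()
t.join()

# Write the captured packets out as a .pcap file.
wrpcap('capture.pcap', captured)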

If that isn't an option, simply call communicate() after terminate(); terminating a process does not discard the data already sitting in its pipes. However, don't forget to separate the arguments when creating the subprocess (['-w', '-'] instead of ['-w -']):

pcap_process = subprocess.Popen(['tcpdump', '-s', '0', '-w', '-', 'tcp'],
                                  stdout=subprocess.PIPE, stderr=subprocess.PIPE)
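Putting it together, a minimal sketch of the terminate-then-communicate flow might look like this (the URL and output filename are placeholders, and the request uses Python 3's urllib.request):

import subprocess
import urllib.request

# Start tcpdump and have it write the raw capture to stdout.
pcap_process = subprocess.Popen(['tcpdump', '-s', '0', '-w', '-', 'tcp'],
                                stdout=subprocess.PIPE, stderr=subprocess.PIPE)

# Generate the traffic to capture; the URL is just an example.
urllib.request.urlopen('http://example.com').read()

# Stop tcpdump, then drain its pipes. communicate() returns whatever
# data was already buffered before the process was terminated.
pcap_process.terminate()
pcap_data, pcap_errors = pcap_process.communicate()

# pcap_data now holds the .pcap contents as bytes.
with open('capture.pcap', 'wb') as f:
    f.write(pcap_data)

Note that tcpdump buffers its output when writing to a pipe, so a very short capture can come back empty; its -U (packet-buffered) option is worth trying in that case.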
answered Nov 14 '22 by phihag