
subprocess.check_output without high memory usage

In my current project I have a webserver that calls Linux commands to get information that is then displayed on the website. The problem is that the webserver runs on a tiny embedded device (it is basically a configuration tool for the device) that has only 256 MB of RAM. The webserver itself takes more than half of the free RAM on that device.

Now when I try to use subprocess.check_output() to call a command, the fork briefly doubles the RAM usage (because, as far as I understand, fork() clones the parent process's address space) and thus crashes the whole thing with an out-of-memory error, even though the called process itself is quite tiny.

Since the device uses pretty cheap flash chips that have proven to fail when written too often, I don't want to use swap or any other solution based on increasing the virtual memory.

What I have tried so far is to Popen an sh session at the start of the program, while memory usage is still low, and then write commands to that session and read the output. This kind of works, but it is quite unstable: a stray "exit" or something similar can bring the whole thing down.
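A minimal sketch of that persistent-shell workaround (helper names are illustrative, not my actual code; a sentinel line marks where each command's output ends, and a command that terminates the shell still breaks everything):

```python
# Rough sketch of the single-shell workaround described above.
# Helper name is hypothetical; a stray "exit" still kills the session.
from subprocess import Popen, PIPE

# Start one sh process early, while memory usage is still low.
sh = Popen(["sh"], stdin=PIPE, stdout=PIPE, universal_newlines=True)

def run_in_shell(command):
    # A sentinel line marks where one command's output ends.
    sh.stdin.write(command + "; echo __CMD_DONE__\n")
    sh.stdin.flush()
    lines = []
    while True:
        line = sh.stdout.readline()
        if line.strip() == "__CMD_DONE__":
            break
        lines.append(line)
    return "".join(lines)
```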

Is there any solution similar to subprocess.check_output() that doesn't double my memory usage?

Dakkaron asked Jun 15 '15

1 Answer

So with the help of J.F. Sebastian I figured it out.

This is the code I used in the end:

from multiprocessing import Process, Queue
from subprocess import check_output, CalledProcessError

def cmdloop(inQueue, outQueue):
    # Runs in a small worker process that is forked early, so each
    # check_output() call only forks this worker, not the big webserver.
    while True:
        command = inQueue.get()
        try:
            result = check_output(command, shell=True)
        except CalledProcessError as e:
            result = e

        outQueue.put(result)

# Start the worker while the parent's memory usage is still low.
inQueue = Queue()
outQueue = Queue()
cmdHostProcess = Process(target=cmdloop, args=(inQueue, outQueue))
cmdHostProcess.start()

def callCommand(command):
    inQueue.put(command)
    return outQueue.get()

def killCmdHostProcess():
    cmdHostProcess.terminate()
    cmdHostProcess.join()

In Python 3.4+ I could have used multiprocessing.set_start_method('forkserver'), but since this has to run on Python 2.7, that is sadly not available.
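For reference, the forkserver approach would look roughly like this on Python 3.4+ (a sketch, not the code I used): the fork server is started while memory usage is still low, and every worker is forked from that small server rather than from the large webserver process.

```python
# Sketch of the Python 3.4+ forkserver alternative (Unix only;
# not usable here, since the device runs Python 2.7).
import multiprocessing as mp
from subprocess import check_output

def run(command):
    return check_output(command, shell=True)

if __name__ == "__main__":
    # Workers are forked from a small server process started now,
    # so later forks never duplicate the webserver's memory.
    mp.set_start_method("forkserver")
    with mp.Pool(processes=1) as pool:
        output = pool.apply(run, ("echo hello",))
```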

Still, this reduces my memory usage significantly and solves the problem in a clean way. Thanks a lot for the help!

Dakkaron answered Sep 16 '22