How to solve a Python memory leak when using urllib2?

I'm trying to write a simple Python script for my mobile phone that periodically loads a web page using urllib2. In fact I don't really care about the server response; I'd only like to pass some values in the URL to the PHP script. The problem is that Python for S60 uses the old 2.5.4 Python core, which seems to have a memory leak in the urllib2 module. From what I've read, there seem to be similar problems in other kinds of network communication as well. This bug was reported here a couple of years ago, and some workarounds were posted along with it. I've tried everything I could find on that page, and with the help of Google, but my phone still runs out of memory after ~70 page loads. Strangely, the garbage collector does not seem to make any difference either, except for making my script much slower. It is said that the newer (3.1) core solves this issue, but unfortunately I can't wait a year (or more) for the S60 port to arrive.

Here's how my script looks after adding every little trick I've found:


import urllib2, httplib, gc

while True:
    # value is set elsewhere in the script
    url = "http://something.com/foo.php?parameter=" + value
    f = urllib2.urlopen(url)
    f.read(1)
    f.fp._sock.recv = None  # hacky workaround (breaks the reference cycle)
    f.close()
    del f
    gc.collect()
Any suggestions on how to make it run forever without getting the "cannot allocate memory" error? Thanks in advance, cheers, b_m

Update: I've managed to connect 92 times before it ran out of memory, but it's still not good enough.

Update 2: I tried the socket method as suggested earlier; this is the second best (still wrong) solution so far:


import socket, threading
from time import sleep

class UpdateSocketThread(threading.Thread):
    def run(self):
        global data
        while 1:
            url = "/foo.php?parameter=%d" % data
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.connect(('something.com', 80))
            s.send('GET ' + url + ' HTTP/1.0\r\n\r\n')
            s.close()
            sleep(1)
I tried the little tricks from above, too. The thread closes after ~50 uploads (the phone has 50 MB of memory free; obviously the Python shell does not).

Update 3: I think I'm getting closer to the solution! I tried sending multiple requests without closing and reopening the socket. This may be the key, since this method leaves only one open file descriptor. The problem is:


import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(("something.com", 80))
s.send("test")  # returns 4 (sent bytes, which is cool)
s.send("test")  # 4
s.send("test")  # 4
s.send("GET /foo.php?parameter=bar HTTP/1.0\r\n\r\n")  # returns the number of sent bytes, ok
s.send("GET /foo.php?parameter=bar HTTP/1.0\r\n\r\n")  # returns 0 on the phone, error on Windows 7*
s.send("GET /foo.php?parameter=bar HTTP/1.0\r\n\r\n")  # returns 0 on the phone, error on Windows 7*
s.send("test")  # returns 0, strange...
*: error message: 10053, software caused connection abort

Why can't I send multiple messages??
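A possible explanation (an assumption, not verified on the phone): with HTTP/1.0 the server is allowed to close the connection after answering the first complete request, so later sends hit a dead socket. A minimal sketch that asks the server to keep the connection open:

import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(("something.com", 80))
for value in ("bar1", "bar2", "bar3"):  # hypothetical parameter values
    # Connection: Keep-Alive asks an HTTP/1.0 server not to close the socket
    s.send("GET /foo.php?parameter=%s HTTP/1.0\r\n"
           "Connection: Keep-Alive\r\n\r\n" % value)
    s.recv(4096)  # read (part of) the response before sending the next request
s.close()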

asked Nov 18 '10 by b_m

2 Answers

Using the test code suggested by your link, I tested my Python installation and confirmed that it indeed leaks. But if, as @Russell suggested, I put each urlopen in its own process, the OS should clean up the memory leaks. In my tests, memory usage, unreachable objects and open files all remained more or less constant. I split the code into two files:

connection.py

import cPickle, sys, urllib2

def connectFunction(queryString):
    conn = urllib2.urlopen('http://something.com/foo.php?parameter=' + str(queryString))
    data = conn.read()
    outfile = open('sometempfile', 'wb')
    cPickle.dump(data, outfile)
    outfile.close()

if __name__ == '__main__':
    connectFunction(sys.argv[1])

launcher.py

import subprocess, cPickle

# code from your link to check the number of unreachable objects
def print_unreachable_len():
    # check memory for leaks
    import gc
    gc.set_debug(gc.DEBUG_SAVEALL)
    gc.collect()
    unreachableL = []
    for it in gc.garbage:
        unreachableL.append(it)
    return len(str(unreachableL))

# my code
if __name__ == '__main__':
    print 'Before running a single process:', print_unreachable_len()
    return_value_list = []
    # values is a list or a generator containing (or yielding) the parameters to pass to the URL
    for i, value in enumerate(values):
        subprocess.call(['python', 'connection.py', str(value)])
        print 'after running', i, 'processes:', print_unreachable_len()
        infile = open('sometempfile', 'rb')
        return_value_list.append(cPickle.load(infile))
        infile.close()

Obviously, this is sequential, so you will only execute a single connection at a time, which may or may not be an issue for you. If it is, you will have to find a non-blocking way of communicating with the processes you're launching, but I'll leave that as an exercise for you.
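One possible direction for that exercise, sketched below: subprocess.Popen returns immediately instead of blocking, so all the connection processes can run at once. This assumes the same connection.py as above.

import subprocess

# launch every connection without waiting for each one to finish
procs = [subprocess.Popen(['python', 'connection.py', str(value)])
         for value in values]  # values as defined above

# later, reap the finished processes so they don't linger as zombies
for p in procs:
    p.wait()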

EDIT: On re-reading your question, it seems you don't care about the server response. In that case, you can get rid of all the pickling related code. And obviously, you won't have the print_unreachable_len() related bits in your final code either.
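In that case, a stripped-down connection.py might look something like this (a sketch; it just fires the request and discards the response):

import sys, urllib2

if __name__ == '__main__':
    conn = urllib2.urlopen('http://something.com/foo.php?parameter=' + sys.argv[1])
    conn.read()
    conn.close()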

answered Sep 21 '22 by Chinmay Kanchi


There exists a reference cycle in urllib2, created at urllib2.py:1216. The issue is ongoing and has existed since 2009: https://bugs.python.org/issue1208304
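A quick way to observe the leak on an affected 2.x build (a sketch; gc.DEBUG_SAVEALL keeps every collected object in gc.garbage, so the cycle becomes visible):

import gc, urllib2

gc.set_debug(gc.DEBUG_SAVEALL)
f = urllib2.urlopen('http://something.com/')
f.read()
f.close()
del f
gc.collect()
print len(gc.garbage)  # nonzero when the reference cycle is present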

answered Sep 21 '22 by imih