
How to go about fixing a memory leak in PHP

My PHP app has an import script that can import records.

At the moment, it is importing from a CSV file. It reads the file one line at a time using fgetcsv, does a lot of processing on each record (including database queries), and then moves on to the next line. It shouldn't need to keep accumulating memory as it goes.

After around 2500 records imported, PHP dies, saying that it has run over its memory limit (132 MB or so).

The CSV file itself is only a couple of megs - the other processing that happens does a lot of string comparisons, diffs, etc. I have a huge amount of code operating on it and it would be difficult to come up with a 'smallest reproducing sample'.

What are some good ways to go about finding and fixing such a problem?
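For reference, the loop is shaped roughly like this (process_record() is just a placeholder for the real per-record work, and the filename is made up), with a memory checkpoint added so growth can be watched:

$fp = fopen('import.csv', 'r');
$n = 0;
while (($row = fgetcsv($fp)) !== false) {
    process_record($row);   // placeholder for the string comparisons, diffs, DB queries, etc.
    if (++$n % 500 === 0) {
        // memory should stay roughly flat between checkpoints if nothing leaks
        echo "row $n: ", memory_get_usage(), " bytes\n";
    }
}
fclose($fp);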

Cause of problem found

I have a debug class which logs all my database queries during runtime. So those strings of SQL, some 30KB long, were staying in memory. I realise this isn't suitable for scripts designed to run for a long time.

There may be other sources of memory leaks, but I am fairly sure this is the cause of my problem.
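For anyone with the same problem: writing each query straight to disk, instead of appending it to an in-memory array, keeps the logging without the accumulation. A minimal sketch of that fix (the class and method names here are illustrative, not my actual code):

class QueryLog {
    private $fp;

    public function __construct($path) {
        // append to a file rather than accumulating SQL strings in memory
        $this->fp = fopen($path, 'a');
    }

    public function logQuery($sql) {
        fwrite($this->fp, date('c') . ' ' . $sql . "\n");
    }

    public function __destruct() {
        fclose($this->fp);
    }
}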

asked Jun 18 '09 by thomasrutter



2 Answers

If you do in fact suspect that there are just one or two memory leaks in your script which are causing it to crash, then you should take the following steps:

  • Change memory_limit to something small, like 500KB
  • Comment out all but one of the processing steps which is applied to each row.
  • Run the limited processing over the whole CSV file and see if it can complete.
  • Gradually add more steps back in and watch to see if memory usage spikes.

Example:

ini_set('memory_limit', '500K');   // cap memory low so a leaking step fails fast
$fp = fopen("test.csv", 'r');
while (($row = fgetcsv($fp)) !== false) {
    validate_row($row);         // step 1: validate
    // add these back in one by one and keep an eye on memory usage
    //calculate_fizz($row);     // step 2: fizz
    //calculate_buzz($row);     // step 3: buzz
    //triangulate($row);        // step 4: triangulate
}
fclose($fp);
echo "Memory used: ", memory_get_peak_usage(), "\n";

The worst case scenario is that all of your processing steps are moderately inefficient and you will need to optimize all of them.
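If memory climbs even with every processing step commented out, circular references are worth ruling out. PHP 5.3 added a cycle collector that you can invoke by hand (earlier versions never reclaim reference cycles); a rough sketch reusing the loop above:

gc_enable(); // PHP 5.3+ only
$fp = fopen("test.csv", 'r');
$rows = 0;
while (($row = fgetcsv($fp)) !== false) {
    validate_row($row);
    if (++$rows % 500 === 0 && gc_collect_cycles() > 0) {
        // cycles were found and freed, so something is creating circular references
        echo "freed cycles near row $rows, memory now ", memory_get_usage(), "\n";
    }
}
fclose($fp);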

answered Oct 18 '22 by too much php


It would help to see the code, but if you want to debug it yourself, take a look at Xdebug; it will help you profile your application.
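For example, with a current Xdebug (version 3 syntax shown here, which is an assumption about your setup; Xdebug 2 used different setting names), you can switch on function tracing in php.ini, and the trace files will show memory usage per call:

; php.ini - Xdebug 3 trace settings; adjust output_dir for your system
xdebug.mode = trace
xdebug.start_with_request = yes
xdebug.output_dir = /tmp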

Depending on what you are doing, it is possible the script legitimately accumulates some memory, although 132MB already seems high for 2500 records. You can raise the memory limit in php.ini if needed.

How big is the CSV file you are reading? And what objects and kind of processing are you doing to it?

answered Oct 18 '22 by lpfavreau