I've got a 16 MB CSV file and try to parse it and do some processing, but the script runs out of memory after a while. I realized that this code generates around 200 MB of used space and that unset() is not working.
$countRows = 1;
$firstRow = true; // the first row contains the column names
var_dump("3. ".memory_get_usage()." beginDiff: ".(memory_get_usage() - $this->startingMemory));
while (($row = fgetcsv($fp, 300000, ';', '"')) !== false)
{
    if ($row === [null]) // fgetcsv() returns [null] for a blank line
        continue;
    if ($firstRow === true)
    {
        foreach ($row as $k => $v)
        {
            $this->columnMapping[$k] = trim(mb_strtolower($v));
        }
        $firstRow = false;
        continue;
    }
    else
    {
        foreach ($row as $k => $v)
        {
            $row[$this->columnMapping[$k]] = $v;
            unset($row[$k]);
        }
    }
    ...
    //$this->theirCategoriesToProducts[$row['kategorie']][]['kodproduktu'] = $row['kodproduktu'];
    $this->theirCategoriesToProducts[$row['kategorie']][] = $row;
}
var_dump("3,5. ".memory_get_usage()." beginDiff: ".(memory_get_usage() - $this->startingMemory));
...
var_dump("7. - before unset total: ".memory_get_usage()." beginDiff: ".(memory_get_usage() - $this->startingMemory));
unset($this->theirCategoriesToProducts);
var_dump("8. - after unset total: ".memory_get_usage()." beginDiff: ".(memory_get_usage() - $this->startingMemory));die;
Generating this output:
string '3. 72417440 beginDiff: 34730040' (length=31)
string '3,5. 292748528 beginDiff: 255061136' (length=36)
string '7. - before unset total: 299039360 beginDiff: 261351984' (length=55)
string '8. - after unset total: 297364432 beginDiff: 259677056' (length=54)
Setting the variable to null instead produces very similar output. But swapping the comments between these two lines
$this->theirCategoriesToProducts[$row['kategorie']][]['kodproduktu'] = $row['kodproduktu'];
//$this->theirCategoriesToProducts[$row['kategorie']][] = $row;
will output:
string '3. 72417784 beginDiff: 34730040' (length=31)
string '3,5. 81081984 beginDiff: 43394248' (length=34)
string '7. - before unset total: 87256544 beginDiff: 49568824' (length=53)
string '8. - after unset total: 85581520 beginDiff: 47893800' (length=52)
So about 200 MB of memory is "lost" (almost half of the dedicated limit).
A recursive function unsetting all parts of the array ate more memory than it was able to free, so it crashed as well.
The array is never used with & anywhere in the script, so there should be no references to other variables.
The file is closed right after the 3.5 dump.
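As an aside, a large part of that "lost" memory is plain PHP array overhead: storing every row as an associative array multiplies the raw CSV size many times over. A minimal, self-contained illustration (the row shape is made up for the demo, and the exact figures are machine- and PHP-version-dependent):

```php
<?php
// Illustration (not from the original script): each PHP array carries
// significant per-element overhead, which is why a 16 MB CSV can expand
// to ~200 MB once every row is stored as an associative array.
$before = memory_get_usage();

$rows = [];
for ($i = 0; $i < 10000; $i++) {
    // one fake "CSV row" with ten short string fields
    $rows[] = array_fill_keys(range('a', 'j'), 'value');
}

$used = memory_get_usage() - $before;
printf("%d rows use %d bytes (~%d bytes/row)\n",
       count($rows), $used, $used / count($rows));
```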
Any other ideas, how to unset that array?
Setting the variable to null frees the memory immediately, while unset() only removes the variable and leaves the actual cleanup to the engine, so it can cost extra CPU cycles and slightly longer execution time.
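A quick way to compare the two approaches is to watch memory_get_usage() before and after freeing a large array; the build() helper below is purely illustrative:

```php
<?php
// Rough sketch: free a large array via unset() and via = null,
// checking memory_get_usage() after each.
function build(): array {
    $a = [];
    for ($i = 0; $i < 100000; $i++) {
        $a[] = str_repeat('x', 32) . $i; // force a distinct string per entry
    }
    return $a;
}

$baseline = memory_get_usage();

$big = build();
$peakWithData = memory_get_usage();

unset($big);                 // remove the symbol; value freed once refcount hits 0
$afterUnset = memory_get_usage();

$big = build();
$big = null;                 // drop the value directly
$afterNull = memory_get_usage();

printf("baseline: %d, with data: %d, after unset: %d, after null: %d\n",
       $baseline, $peakWithData, $afterUnset, $afterNull);
```

On a script without lingering references, both variants should bring usage back close to the baseline.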
The unset() function destroys a variable, and in the same way it can delete a single element of an array: it takes the array key as input and removes that element. The remaining keys and their values are left unchanged.
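For example, a minimal sketch showing that the surviving keys are untouched (note that numeric keys are not re-indexed):

```php
<?php
// unset() removes exactly one element by key; other keys stay as they were.
$fruits = ['a' => 'apple', 'b' => 'banana', 'c' => 'cherry'];
unset($fruits['b']);             // delete a single element by its key
var_dump(array_keys($fruits));   // 'a' and 'c' survive unchanged

$list = [10, 20, 30];
unset($list[1]);                 // numeric keys are NOT re-indexed
var_dump(array_keys($list));     // keys 0 and 2 remain
```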
The function unset() destroys the specified variables.
As of PHP 5.3 there are some garbage-collection mechanisms available, so theoretically you could try something like the example in the docs:
//Memory cleanup for long-running scripts.
gc_enable(); // Enable Garbage Collector
var_dump(gc_enabled()); // true
var_dump(gc_collect_cycles()); // # of elements cleaned up
gc_disable(); // Disable Garbage Collector
But unfortunately, in your case you have to bear in mind that (according to Can I trigger PHP garbage collection to happen automatically if I have circular references?) the garbage collector "will not run, for example, when the memory limit is about to hit. As a result, your script can still abort when hitting memory limit only because PHP is too dumb to collect the cycles in that case!".
In the end, you can try using GC, but it possibly won't solve your problem.
So, what else is there to try? Try splitting the master data array that you import into smaller chunks and importing them sequentially, one after another. Fetch each chunk into the same variable in a loop, then iterate over it to process the records.
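A minimal sketch of that chunked approach. CHUNK_SIZE, processChunk(), and the generated sample file are illustrative placeholders, not part of the original script; the point is that only one chunk of rows is alive at a time:

```php
<?php
// Chunk-wise CSV processing: read a fixed number of rows, process them,
// then reuse the same variable so the previous chunk becomes collectable.
const CHUNK_SIZE = 1000;

// Build a small sample CSV so the sketch is self-contained.
$path = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($path, "Kategorie;KodProduktu\n" . str_repeat("books;123\n", 2500));

$processed = 0;
function processChunk(array $chunk, int &$processed): void {
    // Placeholder: real code would import/transform the records here.
    $processed += count($chunk);
}

$fp = fopen($path, 'r');
$header = fgetcsv($fp, 300000, ';', '"');
$columnMapping = array_map(fn($v) => trim(mb_strtolower($v)), $header);

$chunk = [];
while (($row = fgetcsv($fp, 300000, ';', '"')) !== false) {
    $chunk[] = array_combine($columnMapping, $row);
    if (count($chunk) >= CHUNK_SIZE) {
        processChunk($chunk, $processed);
        $chunk = [];   // reuse the variable; the old chunk can be freed
    }
}
if ($chunk) {
    processChunk($chunk, $processed);  // flush the remainder
}
fclose($fp);
unlink($path);

echo $processed, "\n";
```

This keeps peak memory proportional to CHUNK_SIZE instead of the whole file.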