I'm running the following code over a set of 5,000 results. It's failing due to the memory being exhausted.
foreach ($data as $key => $report) {
$data[$key]['data'] = unserialize($report['serialized_values']);
}
I know I can raise the memory limit, but I'd rather fix the underlying problem; I can't keep raising the limit forever.
EDIT
The $data array is in this format:
[1] => Array
(
[0] => 127654619178790249
[report_id] => 127654619178790249
[1] => 1
[user_id] => 1
[2] => 2010-12-31 19:43:24
[sent_on] => 2010-12-31 19:43:24
[3] =>
[fax_trans_id] =>
[4] => 1234567890
[fax_to_nums] => 1234567890
[5] => long html string here
[html_content] => long html string here
[6] => serialization_string_here
[serialized_values] => serialization_string_here
[7] => 70
[id] => 70
)
Beyond the problems of for and foreach, you need to re-architect your solution. You're hitting memory limits because you're legitimately using too much memory. Each time you unserialize the contents of the database column and store it in an array
$data[$key]['data']
PHP needs to set aside a chunk of memory to store that data so it can be accessed later. When the array gets too big, you run out of memory. In plain English, you're telling PHP:
Take all 5000 rows of data and store them in memory, I'm going to do something with them later.
You need to think of a different way to approach your problem. Here are two quick thoughts.
First, you could avoid storing the items at all and instead act on each one inside the loop, allowing PHP to discard each object once the iteration moves on:
foreach ($data as $key => $report) {
$object = unserialize($report['serialized_values']);
//do stuff with $object here
}
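Taking that first idea further, you can avoid ever holding all 5,000 rows in PHP at once by fetching them one at a time from the database. A rough sketch using PDO; the DSN, credentials, and the table/column names are assumptions based on the dump above, so adjust them to your schema:

```php
<?php
// Hypothetical connection details; adjust to your setup.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// The MySQL driver buffers whole result sets client-side by default;
// turn that off so rows stream through one at a time.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT report_id, serialized_values FROM reports');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $object = unserialize($row['serialized_values']);
    // do stuff with $object here, then let it go out of scope
}
```

This way PHP only ever holds one row and one unserialized object at a time, regardless of how many rows the table contains.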
Second, you could store only the information you need from the unserialized object, rather than the entire object:
foreach ($data as $key => $report) {
    $object = unserialize($report['serialized_values']);
    // Keep only the fields you need instead of the whole object.
    // (The original snippet reused $data for this temporary array,
    // which clobbered the array being iterated.)
    $data[$key]['data'] = array('foo' => $object->foo);
    unset($object);
}
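You can see why keeping only what you need matters with a quick, self-contained demonstration (not from the original answer) using memory_get_usage(). Here the serialized payloads contain a large blob, but the loop retains only one small field per row:

```php
<?php
// Build sample rows with a bulky serialized payload in each.
$rows = array();
for ($i = 0; $i < 1000; $i++) {
    $rows[] = array('serialized_values' => serialize(array(
        'foo'  => $i,
        'blob' => str_repeat('x', 1000),
    )));
}

$before = memory_get_usage();
$kept = array();
foreach ($rows as $row) {
    $object = unserialize($row['serialized_values']);
    $kept[] = array('foo' => $object['foo']); // keep one field only
    // $object (and its 1 KB blob) is freed on the next iteration
}
$after = memory_get_usage();
echo ($after - $before), " bytes retained\n";
```

Retained memory stays small because the large blobs are discarded as soon as each iteration ends; storing the full unserialized structures instead would keep every blob alive.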
Long story short: you're hitting memory limits because you're actually using too much memory. There's no magic solution here. Storing serialized data and attempting to load it all in a single program is a memory intensive approach, irrespective of language/platform.
A foreach
iterates over the array by value, and because the loop body also writes back into $data, PHP's copy-on-write behavior forces it to duplicate the array, roughly doubling peak memory. See the notes and comments in the docs. Use a for
loop (or iterate by reference) and modify each row in place.
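As a minimal sketch of that approach (the column names come from the dump above, and the sample rows stand in for the database results; dropping serialized_values after decoding is an extra saving not in the original answer):

```php
<?php
// Sample rows standing in for the 5,000 database results.
$data = array(
    1 => array('serialized_values' => serialize(array('foo' => 'bar'))),
    2 => array('serialized_values' => serialize(array('foo' => 'baz'))),
);

// Iterate over the keys with a for loop and modify each row in
// place, so PHP never duplicates the whole array mid-loop.
$keys = array_keys($data);
for ($i = 0, $n = count($keys); $i < $n; $i++) {
    $k = $keys[$i];
    $data[$k]['data'] = unserialize($data[$k]['serialized_values']);
    unset($data[$k]['serialized_values']); // free the raw string
}
```

Unsetting the raw serialized string as you go means you never hold both the string and its decoded form for the whole array at once.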