Laravel chunking not reducing PHP memory usage

So I've been trying my hand at Laravel's chunking in Eloquent, but I've run into a problem. Consider the following code (a much simplified version of my problem):

$data = DB::connection('mydb')->table('bigdata')
    ->chunk(200, function($data) {
        echo memory_get_usage();
        foreach ($data as $d) {
            Model::create(array(
                'foo' => $d->bar,
                // ... etc.
            ));
        }
    });

So when I run this code, my memory output looks like this:

19039816
21490096
23898816
26267640
28670432
31038840

So without jumping into php.ini and changing the memory_limit value, any clue why it isn't working? According to the documentation: "If you need to process a lot (thousands) of Eloquent records, using the chunk command will allow you to do without eating all of your RAM".

I tried unset($data) after the foreach loop, but it did not help. Any clue as to how I can make use of chunk, or did I misinterpret what it does?

asked Jan 14 '14 by tiffanyhwang

2 Answers

Chunking the data doesn't reduce memory usage on its own; you need to paginate through the table yourself, directly at the database level.

For example, fetch the first 200 rows ordered by id, and after processing them, run the query again with a where clause asking for the next 200 rows, as in the sketch below.
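
Here is a minimal sketch of that approach, assuming the table has an auto-incrementing id column; the batch size of 200 and the column names are just placeholders:

$lastId = 0;

do {
    // Fetch the next batch, filtering on the last id seen instead of
    // using an ever-growing offset.
    $rows = DB::connection('mydb')->table('bigdata')
        ->where('id', '>', $lastId)
        ->orderBy('id')
        ->limit(200)
        ->get();

    foreach ($rows as $d) {
        Model::create(array(
            'foo' => $d->bar,
            // ... etc.
        ));
        $lastId = $d->id;
    }
} while (count($rows) > 0);

Because each iteration filters on id rather than an offset, the query stays cheap even deep into the table, and only one batch of rows is held in memory at a time.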

answered Sep 30 '22 by Rohit Khatri


You can use lazy collections to improve memory usage for a big collection of data. They use PHP generators under the hood. Take a look at the cursor example here: https://laravel.com/docs/5.4/eloquent#chunking-results
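
A minimal sketch of the cursor approach from those docs, assuming a Bigdata Eloquent model exists for the bigdata table (the model name here is hypothetical):

// cursor() runs a single query but hydrates one model at a time through a
// PHP generator, so only the current row is kept in memory.
foreach (Bigdata::on('mydb')->cursor() as $d) {
    Model::create(array(
        'foo' => $d->bar,
        // ... etc.
    ));
}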

answered Sep 30 '22 by Kamlesh Suthar