
Laravel and running through a batch of results

I'm having a big headache with Laravel's chunk and each functions for breaking up result sets.

I have a table with a column named processed, which starts with a value of 0. If I run the following code, it goes through all 13002 records.

Record::where(['processed' => 0])->each(function ($record) { 
    Listing::saveRecord($record);
}, 500);

This code runs through all 13002 records. However, if I add some code to mark each record as processed, things go horribly pear-shaped.

Record::where(['processed' => 0])->each(function ($record) { 
    $listing_id = Listing::saveRecord($record);

    $record->listing_id = $listing_id;
    $record->processed = 1;
    $record->save();
}, 500);

When this code runs, only 6002 records are processed.

From my understanding of things, on each iteration of the chunk (each runs through chunk under the hood), it executes a new query.

I've come from using Yii2 and I'm mostly happy with the move, except for this hiccup, which has me pulling my hair out. Yii2 has similar functions (each and batch), but they seem to use result sets and pointers, so even if you update the table while you're processing your results, it doesn't affect your result set.
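To make that concrete, here's roughly what I think each/chunk boils down to with a chunk size of 500 (a sketch of the behaviour, not the actual framework source):

$page = 1;
do {
    // Every iteration runs a fresh query against the *current* table state,
    // roughly: SELECT ... WHERE processed = 0 LIMIT 500 OFFSET (page - 1) * 500
    $results = Record::where(['processed' => 0])->forPage($page, 500)->get();

    foreach ($results as $record) {
        // Marking a row processed = 1 drops it out of the WHERE filter,
        // so all the remaining unprocessed rows shift forward...
        $record->processed = 1;
        $record->save();
    }

    // ...but the next page still starts 500 rows further in, so the rows
    // that shifted into the earlier positions are never fetched. That would
    // be consistent with only about half of the 13002 records being handled.
    $page++;
} while ($results->count() === 500);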

Is there actually a better way to do this in Laravel?

asked Mar 12 '23 by SynackSA

1 Answer

Try this

Record::where('processed', 0)->chunk(100, function ($records) {
    foreach ($records as $record) {
        // do your stuff...
    }
});

https://laravel.com/docs/5.3/queries#chunking-results

Sorry about the indentation, I'm on my phone and that doesn't work apparently.
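As a rough sketch (not from the linked docs), the original save/update loop could be combined with chunking like this. Note that plain chunk still pages with limit/offset, so updating the processed column inside the callback can skip rows just like each does; chunkById, if your Laravel version has it, pages on the primary key instead of an offset and sidesteps that:

Record::where('processed', 0)->chunkById(500, function ($records) {
    foreach ($records as $record) {
        // Same processing as in the question, just inside the chunk callback.
        $listing_id = Listing::saveRecord($record);

        $record->listing_id = $listing_id;
        $record->processed = 1;
        $record->save();
    }
});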

answered Mar 16 '23 by René Juul Askjær