
Limiting Eloquent chunks

I have a very large result set to process and so I'm using the chunk() method to reduce the memory footprint of the job. However, I only want to process a certain number of total results to prevent the job from running too long.

Currently I'm doing this, but it does not seem like an elegant solution:

$count = 0;
$max = 1000000;
$lists = Lists::whereReady(true);

$lists->chunk(1000, function (Collection $lists) use (&$count, $max) {
    if ($count >= $max)
        return;

    foreach ($lists as $list) {
        if ($count >= $max)
            break;

        $count++;

        // ...do stuff
    }
});

Is there a cleaner way to do this?

asked Aug 19 '16 by eComEvo

People also ask

How do I limit query results in Laravel?

The limit() method on Laravel's Eloquent and query builders works like MySQL's LIMIT clause. It accepts a single parameter, the maximum number of rows to return: Model::limit(10) or \DB::table('table_name')->limit(10).
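As a rough sketch of how that might look in practice (the User model and users table are illustrative assumptions, not from the question):

// Fetch at most 10 rows via Eloquent.
$users = User::limit(10)->get();

// Equivalent query-builder form.
$rows = \DB::table('users')->limit(10)->get();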

How does chunk work in Laravel?

The chunk() method is part of the query builder; it fetches results from the database in smaller batches instead of loading everything at once. This is useful when your application is working with thousands of records.
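For example, a minimal sketch of processing a table in batches (the User model and batch size of 500 are illustrative assumptions):

// The closure is called once per batch of up to 500 records.
User::orderBy('id')->chunk(500, function ($users) {
    foreach ($users as $user) {
        // ...process each record
    }
});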

What is offset in Laravel?

Limit and offset in Laravel are used to paginate records, or to fetch a fixed number of records starting from a given offset.
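A brief sketch of the two equivalent spellings (the users table and User model are assumptions for illustration):

// Skip the first 20 rows and take the next 10, e.g. page 3 of a 10-per-page list.
$rows = \DB::table('users')->offset(20)->limit(10)->get();

// skip()/take() are aliases for offset()/limit() and work on Eloquent builders too.
$users = User::skip(20)->take(10)->get();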


1 Answer

As of right now, I don't believe so.

There have been some issues and pull requests asking for chunk() to respect a previously set skip/limit, but Taylor has closed them, saying it is expected behavior that chunk() overwrites these.

There is currently an open issue in the laravel/internals repo where he said he'd take another look, but I don't think it is high on the priority list. I doubt it is something he would work on himself, but he may be more receptive to another pull request now.

Your solution looks fine, except for one thing: chunk() will end up reading your entire table unless you return false from your closure. Right now you just return null, so even though your max is set to 1000000, chunk() will still read the whole table. If you return false from the closure once $count >= $max, chunk() will stop querying the database. That also makes chunk() itself return false, but your example code doesn't use the return value of chunk() anyway, so that's fine.
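As a sketch, the loop from the question only needs its early return changed to return false (variable names are taken from the question):

$count = 0;
$max = 1000000;
$lists = Lists::whereReady(true);

$lists->chunk(1000, function (Collection $lists) use (&$count, $max) {
    foreach ($lists as $list) {
        if ($count >= $max) {
            // Returning false tells chunk() to stop fetching further chunks.
            return false;
        }

        $count++;

        // ...do stuff
    }
});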


Another option, assuming you're using sequential ids, would be to get the ending id and then add a where clause to your chunked query to get all the records with an id less than your max id. So, something like:

$max = 1000000;

// Find the id of the record sitting just past the cutoff point.
$maxId = Lists::whereReady(true)->skip($max)->take(1)->value('id');

// Restrict the chunked query to records below that id.
$lists = Lists::whereReady(true)->where('id', '<', $maxId);

$lists->chunk(1000, function (Collection $lists) {
    foreach ($lists as $list) {
        // ...do stuff
    }
});

The code is slightly cleaner, but it is still a hack, and it requires one extra query (to get the max id).

answered Jan 01 '23 by patricus