I am querying a large data set from a table and then iterating over it in a loop to create a JSON file.
$user = App\User::all();
foreach($user as $val){
// logic goes here for creating the json file
}
The problem I am facing is that iterating through this loop consumes a lot of memory and I get the error 'Allowed memory size exhausted'. The CPU usage of the server also becomes very high. My question is how I should use Laravel lazy collections to get rid of this issue. I have gone through the official docs but couldn't find the way.
Just replace the all method with the cursor one.
$user = App\User::cursor();
foreach($user as $val){
// logic goes here for creating the json file
}
For more information about the methods you can chain, refer to the official documentation.
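For example, since cursor() returns a LazyCollection, the usual collection methods can be chained onto it before the loop. A minimal sketch (the active attribute and the filter condition are only illustrative assumptions, not part of the question's schema):
<?php

// cursor() runs a single query and hydrates one User at a time
// through a PHP generator, so chained methods are applied lazily.
$user = App\User::cursor()
    ->filter(function ($val) {
        // hypothetical condition: keep only "active" users
        return $val->active;
    });

foreach ($user as $val) {
    // logic goes here for creating the json file
}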
If you need to handle a big/large/huge Collection/array of data with Laravel, you have to use the LazyCollection class (documentation) with the chunk() method (documentation).
Using chunks is necessary because if your script fails in the middle of a large array, the whole job is cancelled with an error, whereas with chunks part of the job will already have been handled. LazyCollection, in turn, keeps memory usage low. So to get the most benefit, the best approach is to use the two together.
<?php

use Illuminate\Support\LazyCollection;

// $bigArray - an array with large data
$chunkSize = 100;

$lazy = collect($bigArray)->lazy();
// or MyModel::lazy($chunkSize)->..., NOT MyModel::all()->lazy()

$lazy->chunk($chunkSize)
    ->each(function (LazyCollection $items) {
        $items->each(function ($item) {
            // do your job here with $item
        });
    });
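Applied to the question, a sketch of the combined approach might look like this (the chunk size of 500 and the storage/app/users.json output path are illustrative assumptions; writing one JSON object per line avoids building one huge array in memory):
<?php

use Illuminate\Support\LazyCollection;

$handle = fopen(storage_path('app/users.json'), 'w');

// lazy(500) fetches the users in chunked queries of 500 rows behind a
// LazyCollection, so only a small number of models are hydrated at once.
App\User::lazy(500)
    ->chunk(500)
    ->each(function (LazyCollection $users) use ($handle) {
        $users->each(function ($user) use ($handle) {
            // write each record as it is processed instead of collecting everything
            fwrite($handle, $user->toJson() . PHP_EOL);
        });
    });

fclose($handle);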
This method does not guarantee a complete solution to problems like a 504 Gateway Time-out or memory exhaustion. So also try increasing memory_limit and max_execution_time, along with fastcgi_read_timeout if you use the Nginx web server. You can adjust these through your PHP and Nginx settings.