Sometimes when I'm sending over a large dataset to a Job, my queue worker exits abruptly.
// $taskmetas is an array of arrays; each inner array has 90 properties.
$this->dispatch(new ProcessExcelData($excel_data, $taskmetas, $iteration, $storage_path));
The ProcessExcelData job class creates an Excel file using the box/spout package.
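For reference, here is a trimmed-down sketch of the job (simplified from the real class; it uses the box/spout 2.x API and assumes $storage_path is the output file path):

<?php

namespace App\Jobs;

use Box\Spout\Common\Type;
use Box\Spout\Writer\WriterFactory;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ProcessExcelData implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $excel_data;
    protected $taskmetas;
    protected $iteration;
    protected $storage_path;

    public function __construct($excel_data, $taskmetas, $iteration, $storage_path)
    {
        $this->excel_data   = $excel_data;
        $this->taskmetas    = $taskmetas;
        $this->iteration    = $iteration;
        $this->storage_path = $storage_path;
    }

    public function handle()
    {
        // Spout streams rows to disk, so writing is cheap; the memory cost
        // comes from carrying the full $taskmetas payload in the job.
        $writer = WriterFactory::create(Type::XLSX);
        $writer->openToFile($this->storage_path);

        foreach ($this->taskmetas as $taskmeta) {
            $writer->addRow(array_values($taskmeta));
        }

        $writer->close();
    }
}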
- $taskmetas has 880 rows - works fine
- $taskmetas has 10,000 rows - exits abruptly
1st example - queue output with a small dataset:
forge@user:~/myapp.com$ php artisan queue:work --tries=1
[2017-08-07 02:44:48] Processing: App\Jobs\ProcessExcelData
[2017-08-07 02:44:48] Processed: App\Jobs\ProcessExcelData
2nd example - queue output with a large dataset:
forge@user:~/myapp.com$ php artisan queue:work --tries=1
[2017-08-07 03:18:47] Processing: App\Jobs\ProcessExcelData
Killed
I don't get any error messages, the logs are empty, and the job doesn't appear in the failed_jobs
table as it does with other errors. The time limit is set to 1 hour, and the memory limit to 2 GB.
Why are my queues abruptly quitting?
You can try giving the worker a longer timeout, e.g. php artisan queue:work --timeout=120
By default the timeout is 60 seconds, so the option above explicitly overrides it.
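As an alternative to the CLI flag, the timeout can also be declared on the job class itself; the value below is just an example:

class ProcessExcelData implements ShouldQueue
{
    // Maximum number of seconds this job may run before the worker fails it.
    public $timeout = 120;
}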
Sometimes you work with resource-intensive processes like image conversion or creating/parsing big Excel files, and the timeout option alone is not enough. You can set public $timeout = 0;
in your job, but it may still be killed because of memory(!). By default the worker's memory limit is 128 MB. To fix it, add the --memory=256
(or higher) option to avoid this problem.
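Combining both suggestions, the worker from the question could be started like this (the values are only examples; tune them to your workload):

php artisan queue:work --tries=1 --timeout=0 --memory=512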
BTW:
"The time limit is set to 1 hour, and the memory limit to 2GBs"
This applies only to php-fpm in your case, not to the queue worker process.
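The queue worker runs under the PHP CLI, which reads its own php.ini, so the FPM limits never reach it. A quick way to check what actually applies to the worker (standard PHP CLI commands, nothing specific to this app):

php --ini                                        # which ini files the CLI loads
php -r "echo ini_get('memory_limit'), PHP_EOL;"  # the limit the worker will inherit

If that value is lower than expected, raise memory_limit in the CLI php.ini, or start the worker with php -d memory_limit=2G artisan queue:work.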