
Laravel Database Queue, "Killed" after a few seconds

I have a problem in my Laravel project: I'm trying to transcode a video file (about 450MB) with FFMPEG, and since this takes a long time I'm using Laravel queues to do it.

Due to the configuration of my production environment I have to use database queues. The problem is that the queued job gets killed after about 60 seconds every time I run `php artisan queue:work` in my Vagrant box.

The Vagrant box has 4GB of RAM available and 2D/3D acceleration enabled, and `memory_get_peak_usage()` never reports anything above 20MB during the whole process.

I checked `php_sapi_name()` and it's `cli` as expected, so there shouldn't be any execution-time limits at all; regardless, I went into the CLI php.ini and removed any limits again to be certain.

I tried rebooting Vagrant; the job still gets Killed after a few seconds.

So I decided to try creating a Laravel command for the transcoding process. I hardcoded the file paths and such, and lo and behold, it works without being Killed...

Am I missing something about queues? I'm just running `php artisan queue:work` without specifying a timeout of any sort; why is my queue getting killed?
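For context, the job looks roughly like this. This is a simplified sketch, not my exact code: the class name, file paths, and the ffmpeg invocation via Symfony Process are placeholders.

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Symfony\Component\Process\Process;

class TranscodeVideo implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        // Long-running external process, well beyond 60 seconds (paths are placeholders)
        $process = new Process(['ffmpeg', '-i', '/path/to/input.mp4', '/path/to/output.mp4']);
        $process->setTimeout(null); // don't let Symfony Process impose its own time limit
        $process->run();
    }
}
```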

Thank you in advance for your help.

João Serra asked Aug 08 '17 10:08

1 Answer

The default timeout for jobs is 60 seconds, as you've found out. The timeout is specified with the `--timeout[=TIMEOUT]` option, and passing `--timeout=0` disables it entirely:

php artisan queue:work --timeout=0
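You can also set the timeout per job via a public `$timeout` property on the job class. With the database driver, the `retry_after` value for the connection in `config/queue.php` should be larger than the longest-running job's timeout, otherwise the worker may hand the job out a second time while the first attempt is still running. A sketch (the specific numbers here are illustrative, not required values):

```php
// In the job class: give this particular job a generous timeout
public $timeout = 3600; // seconds

// config/queue.php, database connection: retry_after should exceed
// the longest job's timeout so an in-flight job isn't re-dispatched
'database' => [
    'driver'      => 'database',
    'table'       => 'jobs',
    'queue'       => 'default',
    'retry_after' => 3700,
],
```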
sisve answered Oct 23 '22 12:10