I've got a problem in my Laravel project: I'm trying to transcode a video file (about 450MB) with FFMPEG, and since that takes a long time I'm using Laravel queues to do it.
Due to the configuration of my production environment I have to use the database queue driver. The problem is that the queued job gets killed after about 60 seconds every time I run `php artisan queue:work` in my Vagrant box.
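For context, the job looks roughly like this (a simplified sketch; the class name, paths, and ffmpeg arguments are placeholders, not my actual code):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Symfony\Component\Process\Process;

class TranscodeVideo implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $inputPath;
    private $outputPath;

    public function __construct($inputPath, $outputPath)
    {
        $this->inputPath = $inputPath;
        $this->outputPath = $outputPath;
    }

    public function handle()
    {
        // Shell out to ffmpeg; on a ~450MB file this easily
        // runs for several minutes.
        $process = new Process([
            'ffmpeg', '-y', '-i', $this->inputPath,
            '-c:v', 'libx264', '-preset', 'medium',
            $this->outputPath,
        ]);
        $process->setTimeout(null); // no process-level timeout
        $process->run();

        if (! $process->isSuccessful()) {
            throw new \RuntimeException($process->getErrorOutput());
        }
    }
}
```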
The Vagrant box has 4GB of RAM available and 2D/3D acceleration enabled, and `memory_get_peak_usage()` never reports anything above 20MB during the whole process.
I checked `php_sapi_name()` and it returns `cli` as expected, so there shouldn't be any execution time limit at all. Regardless, I went through the CLI `php.ini` and removed any limits again just to be certain.
I tried rebooting Vagrant, but the job gets killed after a few seconds anyway.
So I decided to try creating a Laravel command for the transcoding process instead. I hardcoded the file paths and so on, and lo and behold, it works without being killed...
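The command is essentially the same ffmpeg call wrapped in a console command instead of a job (again a rough sketch; the command name and paths are placeholders):

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Symfony\Component\Process\Process;

class TranscodeTest extends Command
{
    protected $signature = 'video:transcode-test';
    protected $description = 'Transcode a hardcoded test file with ffmpeg';

    public function handle()
    {
        // Same transcode as the job, just run synchronously
        // from the console instead of through the queue.
        $process = new Process([
            'ffmpeg', '-y', '-i', '/home/vagrant/input.mp4',
            '-c:v', 'libx264', '/home/vagrant/output.mp4',
        ]);
        $process->setTimeout(null);
        $process->run();

        $this->info('Done.');
    }
}
```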
Am I missing something about queues? I'm just running `php artisan queue:work` without specifying a timeout of any sort, so why is my queue getting killed?
Thank you in advance for your help.
The default timeout for jobs is 60 seconds, as you've found out. The timeout is specified with the `--timeout[=TIMEOUT]` option, and disabling the timeout entirely is done with `--timeout=0`:

```
php artisan queue:work --timeout=0
```
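Alternatively, you can set the timeout per job by defining a `$timeout` property on the job class (a sketch; the class name is just an example):

```php
class TranscodeVideo implements ShouldQueue
{
    // Let this job run for up to two hours before the worker kills it.
    public $timeout = 7200;

    // ...
}
```

One caveat with the database driver: make sure the `retry_after` value in `config/queue.php` is larger than your timeout, otherwise the queue will assume the job has failed and release it to another worker while the transcode is still running.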