I want to minimize DB queries for an incoming request. It currently requires writes to 6 different tables, and the processing does not need to happen before returning the response. I am therefore considering Laravel queues, but I wonder whether I could also get rid of the separate query that is needed to write to the queue/jobs table. Can I store jobs locally instead of writing them to the DB?
One possible hack would be to have a separate route that I send the data to for processing. This way I would not need to write to the DB, but could just forward the data to that route without waiting for a response. Would this be faster than writing a job to the DB?
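For illustration, here is roughly what the queue-based version would look like (ProcessIncomingData is a placeholder job class wrapping the six table writes); with the database queue driver, the dispatch adds one extra INSERT into the jobs table before the response is returned:

```php
<?php

use App\Jobs\ProcessIncomingData; // placeholder job wrapping the six table writes
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

Route::post('/incoming', function (Request $request) {
    // With the database queue driver this is one extra INSERT into the jobs table.
    ProcessIncomingData::dispatch($request->all());

    // Respond immediately; the writes happen later in a worker.
    return response()->json(['status' => 'accepted'], 202);
});
```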
If you don't want to use the native queue driver (the one that uses the DB), I can advise you to use RabbitMQ. Here is a good driver implementation for Laravel:
RabbitMQ Queue driver for Laravel
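Wiring it up looks roughly like the sketch below; the exact connection keys and env variable names depend on the package version, so treat them as assumptions and follow the package README:

```php
// config/queue.php (sketch only; key names vary between package versions)
'connections' => [
    'rabbitmq' => [
        'driver' => 'rabbitmq',
        'queue'  => env('RABBITMQ_QUEUE', 'default'),
        'hosts'  => [
            [
                'host'     => env('RABBITMQ_HOST', '127.0.0.1'),
                'port'     => env('RABBITMQ_PORT', 5672),
                'user'     => env('RABBITMQ_USER', 'guest'),
                'password' => env('RABBITMQ_PASSWORD', 'guest'),
                'vhost'    => env('RABBITMQ_VHOST', '/'),
            ],
        ],
    ],
],
```

Then point Laravel at it in .env with QUEUE_CONNECTION=rabbitmq (QUEUE_DRIVER=rabbitmq on older Laravel versions).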
You will have to run a daemon service in the CLI using:
./artisan queue:work (daemon version; after each code change you will have to call ./artisan queue:restart so the workers pick up the new code)
or
./artisan queue:listen (runs a fresh ./artisan queue:work --once process for each job)
or use Supervisor with:
./artisan queue:work --once
(my personal favourite)
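If you go the Supervisor route, the program definition is roughly the following; the project path, user and process count are placeholders you will need to adapt:

```ini
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
; with --once, Supervisor restarts the worker after each job,
; so code changes are picked up automatically
command=php /path/to/your/project/artisan queue:work --once
autostart=true
autorestart=true
numprocs=4
user=www-data
redirect_stderr=true
stdout_logfile=/path/to/your/project/storage/logs/worker.log
```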
After that, each task you push to the queue (with dispatch() or \Queue::push()) will be delivered to the RabbitMQ server and executed through the queue driver.
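For example, a minimal job (the class name and payload are placeholders) looks like this:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ProcessIncomingData implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(private array $payload) {}

    public function handle(): void
    {
        // perform the deferred writes to the 6 tables here
    }
}
```

Dispatching stays the same as with any other queue driver, e.g. ProcessIncomingData::dispatch($data); or \Queue::push(new ProcessIncomingData($data)); — the job just goes to RabbitMQ instead of the jobs table.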