
My long running laravel 4 command keeps being killed

I have a Laravel 4 web project that implements a Laravel command.

When running in the development homestead vm, it runs to completion (about 40 seconds total time).

However when running it on the production server, it quits with a 'killed' output on the command line.

At first I thought it was the max_execution_time setting in the CLI php.ini, so I set it to 0 (for unlimited time).

How can I find out what is killing my command?

I run it in an SSH terminal using the standard artisan invocation:

php artisan commandarea:commandname

Does Laravel 4 have a command time limit somewhere?

The VPS is an Ubuntu 4.10 machine with MySQL, nginx, and PHP-FPM.

asked Jan 14 '15 by DEzra

People also ask

Which command is used to run Laravel?

The php artisan serve command runs the application on PHP's built-in development server. As a developer, you can use it to develop and test various functions within the application.
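For example (the host and port flags are optional; these values are illustrative):

php artisan serve --host=0.0.0.0 --port=8080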

How do you kill artisan serve in PHP?

Press Ctrl + Shift + Esc, locate the php process running artisan, and kill it with right click -> kill process. Reopen the command line and start the server again. Note that you should normally be able to kill the process just by sending it a kill signal with Ctrl + C.

Is there any CLI for Laravel?

Artisan is the name of the command-line interface included with Laravel. It provides a number of helpful commands for your use while developing your application.
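All of the available commands can be listed from the project root with:

php artisan list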

What are closure-based console commands in Laravel?

Closure commands are defined using Artisan::command() in routes/console.php. The command() method accepts two arguments: the command signature and a Closure which receives the command's arguments and options.
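For instance, a minimal closure command might look like this (the greet:user signature and name argument are illustrative):

// in routes/console.php
Artisan::command('greet:user {name}', function ($name) {
    // the closure is bound to the command instance, so $this->info() is available
    $this->info("Hello, {$name}!");
});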


3 Answers

So, firstly, thank you to everyone who pointed me in the right direction regarding PHP and Laravel memory usage tracking.

I have answered my own question hoping that it will benefit Laravel devs in the future, as my solution was hard to find.

After typing 'dmesg' to show system messages, I found that the PHP script was being killed by Linux, which pointed to the kernel's out-of-memory (OOM) killer.
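If you want to check this yourself, the kernel's kill messages can be filtered out of the dmesg output like so (the exact wording varies by kernel version):

dmesg | grep -i 'killed process'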

So, I added memory logging calls before and after each of the key areas of my script:

Log::info('Memory now at: ' . memory_get_peak_usage());
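For instance, instrumenting a Laravel 4 command's fire() method might look roughly like this (parseFeed() and insertRows() are hypothetical stand-ins for your own methods):

public function fire()
{
    Log::info('Memory before parse: ' . memory_get_peak_usage());
    $items = $this->parseFeed();   // hypothetical: parse the XML feed

    Log::info('Memory after parse: ' . memory_get_peak_usage());
    $this->insertRows($items);     // hypothetical: bulk-insert via Eloquent

    Log::info('Memory after insert: ' . memory_get_peak_usage());
}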

Then I ran the script while watching the log output and also the output of the 'top' command.

I found that even though my methods were ending and the variables were going out of scope, the memory was not being freed.

Things that I tried that DIDN'T make any difference in my case (see the sketch after this list):

  1. unset($varname) on variables after I had finished with them, hoping to get GC to kick in
  2. adding gc_enable() at the beginning of the script and then adding gc_collect_cycles() calls after a significant number of vars were unset
  3. disabling MySQL transactions, thinking maybe that was memory intensive - it wasn't
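A minimal sketch of what attempts 1 and 2 looked like (again, $items is a hypothetical variable holding the parsed feed data):

gc_enable(); // at the top of the script

$items = $this->parseFeed();
$this->insertRows($items);

unset($items);         // release the reference...
gc_collect_cycles();   // ...and force collection of any reference cycles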

Now, the odd thing was that none of the above made any difference. My script was still using 150 MB of RAM by the time it was killed!

The solution that actually worked:

Now, this is definitely a Laravel-specific solution. But my script's purpose is basically to parse a large XML feed and then insert thousands of rows into MySQL using the Eloquent ORM.

It turns out that Laravel keeps logging information and objects in memory for every query you run, to help you inspect query performance.

By turning this off with the following 'magic' call, I got my script down from 150 MB to around 20 MB!

This is the 'magic' call:

DB::connection()->disableQueryLog();
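In a Laravel 4 command, the natural place for this is at the top of fire(), before any Eloquent work starts. A minimal sketch (importFeed() is a hypothetical name for the bulk work):

public function fire()
{
    // Laravel 4 keeps every executed query in memory by default;
    // disable that before doing thousands of inserts.
    DB::connection()->disableQueryLog();

    $this->importFeed();
}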

I can tell you by the time I found this call, I was grasping at straws ;-(

answered Oct 20 '22 by DEzra


A process may be killed for several reasons:

Out of Memory

There are two ways to trigger this error: exceed the amount of memory allocated to the PHP script in php.ini (the memory_limit setting), or exceed the available system memory. Check the PHP error log and php.ini file to rule out the first possibility, and use the dmesg output to check for the second.
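To see the memory limit your script is actually running under (the CLI often reads a different php.ini than the web server), you can print the effective value:

php -r "echo ini_get('memory_limit'), PHP_EOL;"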

Exceeded the execution time-out limit

In your post you indicate that you disabled the timeout via the max_execution_time setting, but I have included it here for completeness. Be sure that the setting in php.ini is correct and (for those using a web server instead of a CLI script) restart the web server to ensure that the new configuration is active.
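The same one-liner trick confirms the effective timeout (0 means unlimited, which is also the CLI default):

php -r "echo ini_get('max_execution_time'), PHP_EOL;"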

An error in the stack

If your script is error-free and not encountering either of the above errors, ensure that your system is running as expected. When using a web server, restart the web server software. Check the error logs for unexpected output, and stop or upgrade related daemons as needed.

answered Oct 20 '22 by George Cummins


Had this issue on a Laravel/Spark project. Just wanted to share in case others have this issue.

Try a refresh/restart of your dev server (if running Vagrant or Ubuntu) before more aggressive approaches.

I accidentally ran an install of dependency packages on a Vagrant server, and I also removed and replaced a mirrored folder repeatedly during install errors. My error was on Laravel/Spark 4.~. I was able to run migrations on other projects, but on one particular project I was getting 'killed' very quickly (within about 300 ms) for nearly all commands. Reading other users' reports, I was dreading trying to track down the issue or corruption. In my case, a quick Vagrant reload did the trick, and the 'killed' issue was resolved.
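For reference, the reload is a single command, run from the directory containing the Vagrantfile:

vagrant reload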

answered Oct 20 '22 by Fujisan