I'm running a big PHP script that might take a full day to finish. It pulls data from a MySQL database and uses curl to test things, across about 40,000 records.
To let it run in the background for as long as it needs, I execute it from the terminal. The script itself contains this setting so it runs until it finishes:
set_time_limit(0); // run without timeout limit
and because I execute it from another, separate PHP script, I also use this function:
ignore_user_abort(1); // ignore my abort
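Taken together, a minimal sketch of the top of such a long-running script:

```php
<?php
// Top of the long-running script: disable the execution time limit and
// keep running even if the client that started it disconnects.
set_time_limit(0);       // 0 = no time limit
ignore_user_abort(true); // don't die when the browser goes away
```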
Executing it directly from the command line gives me only two choices:
1) wait until the script finishes
2) cancel the whole process
After searching, I found an article that offers a third choice: run it in the background for as long as possible by creating an external PHP script that executes the main big PHP script with:
exec("php bigfile.php");
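Note that a bare exec() call blocks until bigfile.php finishes. On Unix-like systems, redirecting output and appending & detaches the process so the wrapper page returns immediately. A sketch (the worker.log file name is an assumption, not from the original script):

```php
<?php
// Launch the big script detached so exec() returns immediately.
// Without the redirection and the trailing "&", exec() would block
// until bigfile.php finished; nohup also shields it from hangups.
exec("nohup php bigfile.php > worker.log 2>&1 &");
```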
That means I can open that external page normally in a browser and close it without worry, since ignore_user_abort()
keeps the script running in the background. That's still not the problem.
The problem is that after some unknown period, the script stops working. How do I know? I have it write the current date and time to an external file for each record it processes, and I keep refreshing that file to see whether it has stopped updating.
After an unknown period it actually stops, for no apparent reason: nothing in the script tells it to stop, and if anything goes wrong with a record I tell it to skip that record (nothing goes wrong anyway; the records all work the same way, so if one works they all should).
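The per-record heartbeat described above can be written as a small helper (the file name and record-id shape are illustrative assumptions):

```php
<?php
// Hypothetical heartbeat helper: append one timestamped line per record,
// so an external page (or `tail -f`) shows where the script last got to.
function log_heartbeat(string $file, $recordId): void
{
    $line = date('Y-m-d H:i:s') . " processed record {$recordId}\n";
    file_put_contents($file, $line, FILE_APPEND | LOCK_EX);
}
```

Called as e.g. `log_heartbeat('progress.log', $row['id']);` inside the loop.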
However, my main doubts are about the following:
Could the while loop crash the whole script if an error occurred inside it? Is there any timeout anywhere that would crash the whole script? If neither of those applies, is there any way to log what exactly is going on in the script right now, or why it would crash? Any detailed way to log everything?
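On the logging question: PHP can be told to record every error, including the fatal ones that kill a CLI script silently, in a log file. A sketch, assuming a writable log path of your choosing:

```php
<?php
// Log everything to a file instead of losing it when the process dies.
error_reporting(E_ALL);
ini_set('log_errors', '1');
ini_set('error_log', '/var/log/bigfile-php.log'); // path is an assumption

// Fatal errors bypass normal error handlers, so also record the last
// error at shutdown; if the script dies, this says why.
register_shutdown_function(function () {
    $e = error_get_last();
    if ($e !== null) {
        error_log("script died: {$e['message']} in {$e['file']}:{$e['line']}");
    }
});
```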
I found this in the messages file in /var/log/:
Dec 29 16:29:56 i0sa shutdown[5609]: shutting down for system halt
Dec 29 16:30:14 i0sa exiting on signal 15
Dec 29 16:30:28 i0sa syslogd 1.5.0#6: restart.
Dec 29 16:50:28 i0sa -- MARK --
.....
Dec 29 18:50:31 i0sa -- MARK --
Dec 29 19:02:36 i0sa shutdown[3641]: shutting down for system halt
Dec 29 19:03:11 i0sa exiting on signal 15
Dec 29 19:03:48 i0sa syslogd 1.5.0#6: restart.
It says "system halt". I'll try to confirm this in future crashes by matching the times. Could this be causing it, and why? Also, memory_limit is 128M while the server has 2GB of RAM; could that be it?
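If the 128M limit were the culprit, PHP would die with a fatal "Allowed memory size ... exhausted" error, which shows up in the PHP error log when log_errors is enabled. Raising the limit for just this script is a one-line change; 512M below is an arbitrary choice, not a recommendation:

```php
<?php
// Raise the memory ceiling for this script only (value is arbitrary);
// periodically logging usage shows whether memory actually grows.
ini_set('memory_limit', '512M');
error_log('memory in use: ' . memory_get_usage(true) . ' bytes');
```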
P.S.: I restarted the server several times manually, but these entries say shutdown and halt?
For such cases I use the nohup command with success, like this:
nohup php /home/cron.php >/dev/null 2>&1 &
You can then check whether the script is running with:
jobs -l
Note: when you use the nohup command, the path to the PHP file must be absolute, not relative. I also think it is not very graceful to call one PHP file from another only to prevent the execution from stopping before the work is finished.
External reference: http://en.wikipedia.org/wiki/Nohup
Also make sure your script has no memory leaks, which would make it crash after some time with an "out of memory" error.
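Freeing per-record data inside the loop keeps a 40,000-iteration run flat; a sketch of the pattern (variable names and the records shape are illustrative):

```php
<?php
// Inside the 40,000-record loop: release per-record resources explicitly
// so memory stays flat over a day-long run.
foreach ($records as $row) {
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);      // release the curl handle every iteration
    // ... test $response here ...
    unset($response);     // drop large strings before the next record
}
gc_collect_cycles();      // reclaim any lingering reference cycles
```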