Prevent PHP script using up all resources while it runs?

Tags:

php

I have a daily cron job which takes about 5 minutes to run (it does some data gathering and then various database updates). It works fine, but the problem is that, during those 5 minutes, the site is completely unresponsive to any requests, HTTP or otherwise.

It would appear that the cron job script takes up all the resources while it runs. I couldn't find anything in the PHP docs to help me out here - how can I make the script know to only use up, say, 50% of available resources? I'd much rather have it run for 10 minutes and have the site available to users during that time, than have it run for 5 minutes and have user complaints about downtime every single day.

I'm sure I could come up with a way to configure the server itself to make this happen, but I would much prefer if there was a built-in approach in PHP to resolving this issue. Is there?

Alternatively, as plan B, we could redirect all user requests to a static downtime page while the script is running (as opposed to what's happening now, which is the page loading indefinitely or eventually timing out).
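One way to sketch that plan B is a lock file: the cron script creates it while running, and the site's front controller serves a static downtime page whenever it exists. The paths and file names here are illustrative, not part of any framework:

```php
<?php
// In the cron script: hold a lock file for the duration of the job.
$lock = sys_get_temp_dir() . '/maintenance.lock';
touch($lock);
try {
    // ... data gathering and database updates ...
} finally {
    unlink($lock); // always remove the lock, even if the job throws
}

// In the site's front controller: short-circuit to a static page while locked.
if (file_exists(sys_get_temp_dir() . '/maintenance.lock')) {
    http_response_code(503);              // "temporarily unavailable", crawler-friendly
    header('Retry-After: 600');           // hint: try again in ~10 minutes
    echo 'Down for maintenance, back in a few minutes.'; // or readfile() a static HTML page
    exit;
}
```

A 503 with Retry-After is preferable to letting requests hang, since clients and search engines know the outage is temporary.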

asked Mar 27 '18 by sveti petar

2 Answers

A normal script can't hog 100% of resources; resources get split over the processes. It could slow everything down intensely, but not lock up all resources (without doing some funky stuff). You can get a hint by running top on the command line and seeing which process takes up a lot.

That leads to the conclusion that something is locking out all further processes. As Arkascha comments, there is a fair chance that your database gets locked. This answer explains which table type you should use; if you do not have it set to InnoDB, you probably want that, at least for the tables being locked.
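As a rough sketch of checking for this, the query below lists tables still on MyISAM (which uses table-level locks) so they can be converted to InnoDB (row-level locks). The connection credentials are placeholders, and note that the ALTER itself takes a one-off lock, so run it during a quiet window:

```php
<?php
// Placeholder credentials; adjust for your environment.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// Find tables that still use the table-locking MyISAM engine.
$stmt = $pdo->query(
    "SELECT TABLE_NAME FROM information_schema.TABLES
     WHERE TABLE_SCHEMA = DATABASE() AND ENGINE = 'MyISAM'"
);
foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $table) {
    // Converting locks the table once while it runs, so pick a quiet moment.
    $pdo->exec("ALTER TABLE `$table` ENGINE=InnoDB");
}
```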

It could also be disk I/O if you write huge files. Try to split the work into smaller reads/writes, or try to move some of the info (e.g. if it consists of files containing lists) into your database (assuming that has room to spare).
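Splitting reads into smaller pieces usually means streaming the file instead of loading it whole. A minimal sketch (the function name is illustrative):

```php
<?php
// Stream a large file line by line instead of loading it all at once,
// so memory use stays flat and disk reads are spread out over time.
function processInChunks(string $path): int
{
    $handle = fopen($path, 'r');
    $count = 0;
    while (($line = fgets($handle)) !== false) {
        // ... handle one record at a time instead of the whole file ...
        $count++;
    }
    fclose($handle);
    return $count;
}
```

The same idea applies to writes: append in batches rather than building one giant string and writing it in a single burst.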

It could also be CPU. To fix that, you need to make your code more efficient. Recheck your code, find the heavy operations, and try to make them smaller. Normally you want code to be as fast as possible; here you want it as lightweight as possible, which changes the way you write it.

If it still locks up, it's time to debug. Turn off a large part of your code and check whether the locking still happens. Keep turning code back on until you notice the locking; then fix that part. Try to figure out what is costing you so much. Only a few scripts require intense resources, so it is now time to optimize. One option might be splitting the job into two (or more) steps: one cron job that prepares/sanitizes the data, and one that processes it. They don't have to run synchronously; there can be a few minutes between them.
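The two-step split might look like this in the crontab (paths and times are illustrative):

```cron
# Step 1: gather and sanitize the data into a staging table or file
0 3 * * *  php /var/www/cron/prepare.php
# Step 2: process the staged data, ten minutes later
10 3 * * * php /var/www/cron/process.php
```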

If that is not an option, benchmark your code and improve it as much as you can. If you have a heavy query, it might improve by selecting only IDs in the heavy query and using a second query just to fetch the data. If you can, use your database to filter, sort and manage data; don't do that in PHP.
Something I have also implemented is sleeping every N actions.
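That "sleep every N actions" idea can be sketched like this (batch size and sleep length are illustrative numbers to tune for your workload):

```php
<?php
// After each batch of work, pause briefly so other processes get a turn.
function throttledRun(array $items, int $batchSize = 500): int
{
    $done = 0;
    foreach ($items as $item) {
        // ... process $item ...
        $done++;
        if ($done % $batchSize === 0) {
            usleep(250000); // yield the CPU for 0.25s after every batch
        }
    }
    return $done;
}
```

This deliberately trades total runtime for responsiveness, which is exactly the 10-minutes-instead-of-5 trade-off the question asks for.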

If your script really is that extreme, another solution could be moving it to a time when few or no visitors are on your site. Even if you remove the bottleneck, nobody likes a slow website.

And there is always the option of increasing your hardware.

Martijn answered Oct 18 '22


You don't mention which resource is your bottleneck: CPU, memory or disk I/O.

However, if it is CPU or memory, you can do something like this in your script: http://php.net/manual/en/function.sys-getloadavg.php http://php.net/manual/en/function.memory-get-usage.php

$yourlimit = 100000000; // memory cap in bytes, roughly 100 MB
$load = sys_getloadavg();
// Back off when the 1-minute load average or memory use gets too high.
if ($load[0] > 0.80 || memory_get_usage() > $yourlimit) {
    sleep(5);
}

Another thing to try would be to set your process priority in your script: http://php.net/manual/en/function.proc-nice.php Note that only raising priority (a negative increment) requires superuser rights; lowering it, as here, does not, so it works fine in a cron job.

 proc_nice(19); // 19 is the lowest priority; the OS caps nice values at 19

I did a quick test of both and they work like a charm. Thanks for asking; I have a cron job like that as well and will implement this. It looks like proc_nice alone will do fine.

My test code:

proc_nice(19);            // run at the lowest CPU priority
$yourlimit = 100000000;   // memory cap in bytes, roughly 100 MB
$x = 0;
while (1) {
    $x = $x + 1;
    $load = sys_getloadavg();
    if ($load[0] > 0.80 || memory_get_usage() > $yourlimit) {
        sleep(5);         // back off when the box is busy
    }
    echo $x."\n";
}
PaulV answered Oct 18 '22