Maximum execution time error in PHP script, is this a good solution to bypass it?

I'm the only administrator of the database. I want to run a one-time script that takes around 3,800 scanned images (the collection will grow to about 10,000) and creates a couple of thumbnails for each image, using PHP's exec() function to run the external program ImageMagick.
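
For context, each thumbnail is produced by a call along these lines; the paths and the 200x200 geometry below are made-up examples rather than my real values:

$src = '/path/to/scans/img_0001.jpg';
$dst = '/path/to/thumbs/img_0001.jpg';
// Run ImageMagick's convert; -thumbnail resizes and strips most metadata
exec('convert ' . escapeshellarg($src) . ' -thumbnail 200x200 ' . escapeshellarg($dst), $out, $status);
if ($status !== 0) {
    // log the failure and move on to the next image
}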

I've created the script, launched it, and everything works perfectly! All of this is on my local development server. The script takes around 11 minutes to create the thousands of thumbnails. It's a one-time operation run every other year, so the consequences are minimal.

So far so good. Here's where I'm running into problems.

Everything I did on my local development server I then did on the live server for testing purposes. I have a shared hosting account with HostGator. Running my 11-minute script on a shared host gives me the error 'Maximum execution time of 30 seconds exceeded...'. I did my research and tried many of the solutions found in this post (Increase max execution time for php), only to realize there is nothing I can do to change the maximum execution time of a script on a shared host.

I'm stuck. So, my question is: what is the obvious solution here?

I was thinking of launching the script for 200 images at a time, refreshing the page automatically, and running the script again for the next 200 images, and so on until there are no more images left. That way I'm sure the 30-second maximum execution time allowed on my shared host is respected. It seems like a workable solution right off the top of my head, but I'm not sure if it's a no-no, or whether I'm going to run into bigger problems or too many downsides.
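
Roughly, I had something like this in mind; the offset parameter, the batch size, and the two helper functions are illustrative stand-ins, not real code from my script:

$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$batch  = 200;
// get_all_images() stands in for however the file list is loaded
$images = array_slice(get_all_images(), $offset, $batch);
foreach ($images as $image) {
    create_thumbnails($image); // the exec()/ImageMagick call
}
if (count($images) === $batch) {
    // Possibly more images left: reload this script for the next batch
    header('Location: ' . $_SERVER['PHP_SELF'] . '?offset=' . ($offset + $batch));
    exit;
}
echo 'All done';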

Is this the obvious solution? Has anyone run into the same problem? What would you suggest?

Thanks

asked May 10 '16 by Marco


1 Answer

Assuming you do have a reason to recreate the thumbnails in batch, instead of doing it at each image upload as was suggested, I'd do exactly as you did - use a script that refreshes itself - except that I wouldn't set a fixed number of images.

Rather, I would have the script time itself after each image and stop when it has reached, say, 25 seconds:

$stop = time() + 25;
while (time() < $stop) {
    // ...find the next image to process, process it...
    if (finished()) {
        die("OK");
    }
}
// Redirect to next batch of images
header('Location: ....');
die();
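
The Location target (elided above) would typically be the script's own URL, carrying some kind of cursor - the last processed image ID, for example - so the next request knows where to resume; how you encode that depends on how the image list is stored.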

However, do check with your ISP, because your script might either be seen as an abuse of the service or be mistaken for an attack. Also, enquire whether there's a preferred time of day to run this kind of maintenance.

Another, naughtier way of doing the same thing is to have the script run for a very small number of images (possibly a single one) every time someone hits the home page. This has the effect of having the extra load from the script mimic the real load on the server, avoiding embarrassing spikes or mysterious nonzero base loads. You do need to find a way of never choosing the same image from different instances of the script running in parallel (when I had to do this, I set a flag in a database).
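
Here is a minimal sketch of that claiming step, assuming PDO and a MySQL table whose names (images, claim, done) are invented for illustration; adapt it to your actual schema:

// Placeholder connection; substitute your real credentials
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Atomically claim one unprocessed image, so that two requests
// running in parallel can never pick the same row
$token = uniqid('', true);
$pdo->prepare('UPDATE images SET claim = ? WHERE claim IS NULL LIMIT 1')
    ->execute([$token]);

$stmt = $pdo->prepare('SELECT id, path FROM images WHERE claim = ?');
$stmt->execute([$token]);
if ($row = $stmt->fetch()) {
    create_thumbnails($row['path']); // hypothetical helper wrapping the exec() call
    $pdo->prepare('UPDATE images SET done = 1 WHERE id = ?')
        ->execute([$row['id']]);
}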

answered Nov 10 '22 by LSerni