After searching Stack Overflow for an answer, I found that the only problems treated were those about extending the maximum allowed memory. My question is whether anyone knows how to insert a proper brake so that a script stops safely when it is about to exhaust its memory.
That was point A. Point B is closely related and equally unanswered for me: the time limit. Another set of answers explains how to extend the time limit with ini_set and so on; a possible solution is to manually create break points that fire once the script has already exceeded a reasonable time. That is fine, but when you have no control over a script you are loading, it may be impossible to prevent the overrun.
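For illustration, here is a minimal sketch of the kind of brake I have in mind, assuming PHP 8; process() and $workItems are hypothetical placeholders for whatever work the script does:

```php
<?php
// Minimal sketch of a "brake": check memory and elapsed time inside the
// work loop and stop cleanly before the hard limits are actually hit.

function memoryLimitBytes(): int
{
    $limit = ini_get('memory_limit');        // e.g. "128M", or "-1" for no limit
    if ($limit === '-1') {
        return PHP_INT_MAX;
    }
    $value = (int) $limit;
    return match (strtoupper(substr($limit, -1))) {
        'G' => $value * 1024 ** 3,
        'M' => $value * 1024 ** 2,
        'K' => $value * 1024,
        default => $value,
    };
}

$workItems = range(1, 1_000_000);             // hypothetical work queue
function process(int $item): void { /* ... the real work ... */ }

$start     = microtime(true);
$timeLimit = (int) ini_get('max_execution_time');      // 0 means unlimited (CLI)
$maxTime   = $timeLimit > 0 ? 0.8 * $timeLimit : INF;  // brake at 80% of the limit
$maxMemory = 0.8 * memoryLimitBytes();

foreach ($workItems as $item) {
    if (memory_get_usage(true) > $maxMemory || microtime(true) - $start > $maxTime) {
        // Brake: save progress and exit cleanly instead of crashing mid-run.
        break;
    }
    process($item);
}
```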
Now, back to the question: does anyone know how to safely prevent such errors? And if there really is an already-answered question on this subject, please link me to it.
Best regards,
The usual approach to reducing the amount of memory a program uses is to break the problem into smaller pieces, refine answers iteratively, use files on disk in place of in-memory structures, and so on. Without knowing the details of your application, it's impossible to suggest the best approach.
But consider the task of sorting a 100-megabyte input file in perhaps 100 KB of memory; you might think the task impossible without holding all 100 megabytes in memory at once. However, you can split this problem into smaller pieces in several different ways:
- Perhaps sort all lines starting with integers 0-1000 into one file, 1001-2000 into another file, 2001-3000 into a third file, and so on. Once all the lines are segregated into different files, sort each file, and then merge the sorted files with a simple concatenation.
- Perhaps sort the first 1000 lines and store them in a temporary file, sort the next 1000 lines into another temporary file, and so on; then merge the output by repeatedly selecting the lowest head line among the files, outputting it, and removing it from its file (see the sketch after this list).
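Here is a minimal sketch of that second, chunk-and-merge approach in PHP. The file names are hypothetical, and the input is assumed to be plain text with one record per line:

```php
<?php
// Phase 1: sort fixed-size chunks of the input and spill each chunk
// to its own temporary file, so only $chunkSize lines are in memory.

$chunkSize = 1000;
$chunks    = [];
$in        = fopen('input.txt', 'r');

while (!feof($in)) {
    $lines = [];
    while (count($lines) < $chunkSize && ($line = fgets($in)) !== false) {
        $lines[] = rtrim($line, "\n");
    }
    if ($lines === []) {
        break;
    }
    sort($lines);
    $tmp = tmpfile();
    fwrite($tmp, implode("\n", $lines) . "\n");
    rewind($tmp);
    $chunks[] = $tmp;
}
fclose($in);

// Phase 2: k-way merge -- repeatedly emit the smallest head line.
$out   = fopen('sorted.txt', 'w');
$heads = array_map(fn ($f) => rtrim(fgets($f), "\n"), $chunks);

while ($chunks !== []) {
    $minIdx = array_keys($heads, min($heads))[0];
    fwrite($out, $heads[$minIdx] . "\n");
    $next = fgets($chunks[$minIdx]);
    if ($next === false) {                    // this chunk is exhausted
        fclose($chunks[$minIdx]);
        array_splice($chunks, $minIdx, 1);
        array_splice($heads, $minIdx, 1);
    } else {
        $heads[$minIdx] = rtrim($next, "\n");
    }
}
fclose($out);
```

Peak memory here is bounded by the chunk size plus one head line per temporary file, no matter how large the input grows.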
Consider the task of image editing: rather than loading an entire image into memory, you might read only the portions you're going to work on, perform the operation, and immediately write the results back to disk.
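The same pattern in code, assuming for simplicity a raw, uncompressed pixel buffer; real formats such as PNG or JPEG would need a format-aware library, but the strip-at-a-time structure is the same. This hypothetical example inverts an image 64 KB at a time:

```php
<?php
// Process one strip at a time; the rest of the image stays on disk.
$in  = fopen('image.raw', 'rb');
$out = fopen('image-inverted.raw', 'wb');

while (($strip = fread($in, 64 * 1024)) !== false && $strip !== '') {
    fwrite($out, ~$strip);   // bitwise NOT on a string inverts each byte
}

fclose($in);
fclose($out);
```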
Or, if your problem is a large regular expression match, you might have a significant backtracking problem in which too much state is stored at once. You could switch to matching constructs that prevent backtracking, such as atomic groups or possessive quantifiers, and perhaps solve the problem with multiple smaller matches.
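For example, a minimal sketch of the problem and the fix with PHP's PCRE functions; the patterns here are purely illustrative:

```php
<?php
// The nested quantifier (a+)+ piles up exponential backtracking state on
// a non-matching input, so PCRE gives up with a limit error; the atomic
// group (?>...) gives nothing back on failure and fails fast instead.

$subject = str_repeat('a', 40) . '!';

var_dump(preg_match('/^(a+)+$/', $subject));   // bool(false): PCRE hit a limit
var_dump(preg_last_error() !== PREG_NO_ERROR); // bool(true): gave up, not "no match"

var_dump(preg_match('/^(?>a+)$/', $subject));  // int(0): fails immediately
```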
How you can break a problem into smaller pieces to reduce overall memory use depends on the problem you have to solve. Note that many of these techniques increase execution time, because they require more disk I/O or re-solving subproblems whose answers could have been kept in memory. Occasionally they speed things up instead: the less data you work with and keep around, the better the locality of reference and the better the caching works. (This is rare; see Programming Pearls for a very striking case of a speedup due to simpler problem solving.)