I have a script that runs successfully when PHP's memory_limit is set to 50M, at about 25 seconds of runtime. When I print memory_get_peak_usage at the end of the script, it lands pretty close to 50M. When I set the memory_limit higher, to 90M, memory_get_peak_usage shows around 75M and the script runs about 10 seconds faster.
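For context, the measurement looks roughly like this (the limit value and the workload are placeholders, not my real script):

```php
<?php
// Placeholder harness for the measurement described above;
// the limit and the workload stand in for the real script.
ini_set('memory_limit', '50M');

$start = microtime(true);

// ... the real work happens here ...

printf(
    "Peak memory: %.2f MB, runtime: %.1f s\n",
    memory_get_peak_usage(true) / 1048576,
    microtime(true) - $start
);
```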
It seems intuitive that a script would use about the same amount of memory regardless of the memory_limit, but that doesn't seem to be the case. If a script maxes out at just under 50M with a limit of 50M, I had expected the peak usage to be the same even after the memory_limit was increased.
The only explanation I have is that PHP recognizes it's close to its limit and spends time clearing unused memory in order to avoid hitting it. Is this how it actually works, or have I just scratched the surface of something bigger?
What you see is the garbage collector doing its job.
When you reassign the value of a non-primitive variable, the old value is not immediately discarded from memory. It remains there, still counting toward your script's memory usage.
Only when your script gets dangerously close to its memory limit is the garbage collector called to clean up those unused pieces of allocated memory and free up space for the script. This process is costly, and that's why the script runs faster with more memory: the garbage collector doesn't need to run as often.
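For illustration, here is a minimal, self-contained sketch of the collector reclaiming memory that plain reference counting cannot (the object count and payload size are arbitrary, chosen only to make the effect visible):

```php
<?php
// Objects that reference themselves form cycles that reference
// counting alone cannot free; they linger until the cycle collector runs.
for ($i = 0; $i < 5000; $i++) {            // 5000 stays below the default
    $obj = new stdClass();                 // GC root-buffer threshold, so
    $obj->self = $obj;                     // collection is not triggered
    $obj->payload = str_repeat('x', 1024); // automatically
    // Reassigning $obj on the next iteration drops the last external
    // reference, but the self-reference keeps the old object alive.
}

printf("Before collection: %.2f MB\n", memory_get_usage() / 1048576);
$freed = gc_collect_cycles();              // force a collection run
printf("After collection:  %.2f MB (%d cycles freed)\n",
       memory_get_usage() / 1048576, $freed);
```

Running this, the "before" figure should come out several megabytes higher than the "after" figure.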
Edit:
Buffering plays a part in this as well. If your script writes large amounts of data to files, that data is first queued in memory, because your hard disk cannot write it as fast as you generate it. If you generate data much faster than the disk can write it, the available memory will eventually fill up, and your program will be forced to wait the next time you call fwrite() or any other function that puts data into a buffer.
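If writes are the bottleneck, flushing periodically keeps buffered data from piling up; a rough sketch, where the file path, chunk size, and flush interval are all arbitrary:

```php
<?php
// Hypothetical example: write generated data in fixed-size chunks and
// flush periodically so buffered bytes do not accumulate in memory.
$fp = fopen('/tmp/output.dat', 'wb');

for ($i = 0; $i < 10000; $i++) {
    fwrite($fp, str_repeat('x', 8192)); // generate and write one chunk
    if ($i % 100 === 0) {
        fflush($fp);                    // push PHP's buffer to the OS
    }
}

fclose($fp);
```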