I have an application for which I am performing memory-usage analysis. I load the application with some data, and the application caches information out of this pumped data (it stores it in hash tables and other data structures — let's say records). For each record stored in memory, the application allocates memory using malloc/calloc. After a certain time, around 80% of these records time out and the application frees the memory it had allocated for them.

To keep a check on the memory usage, I ran a background script that captured the output of top as well as "free -m", and plotted a graph of the system's memory usage. The graph shows the trend in the values printed by "free -m" under the "-/+ buffers/cache" row, columns "used" and "free". I was expecting the "used" graph to increase in the beginning and then decrease as the application frees memory, but that is not what happens. The RES column in the output of top does not decrease either. Can someone help me understand the memory dynamics of an application as seen under Linux?
malloc is a library call. If it has no memory left in its own pool, it asks the kernel to map more memory into the process (via brk/sbrk or mmap).
For small buffers, malloc hands out memory from its own pool, and free returns the memory to that pool without actually releasing it to the system, unless the pool grows too large.
For large buffers, malloc works completely differently: it asks the kernel directly (with mmap), which is slower, but when such a buffer is freed it is returned to the system immediately. That is why freeing many small records does not make RES or the "used" figure drop, while freeing one large buffer does.
See the NOTES section of the malloc man page: http://linux.die.net/man/3/malloc