Should I free memory for a short lived program if memory footprint is not an issue? [closed]

I have a program that does the following:

  1. Allocate memory (heap)
  2. Do some processing
  3. Allocate more memory (heap)
  4. Do some processing

It does this a few times and then exits.

I don't really care about the memory footprint of the program, only the execution time.

Would it be a bad thing not to free the memory, on the grounds that it might actually take more time to process the free/delete calls than to simply skip them and move on to the next step?

In the end, the program will exit and it won't matter anymore.

I understand that the only way to be sure is to do some benchmarks on my computer, but I'm interested in the theoretical pros and cons.

NB: let's assume a modern OS that will clean up the memory at exit.
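
For concreteness, here is a minimal sketch of the pattern I'm describing, assuming C++ with malloc(); the processing function and buffer sizes are placeholders:

```cpp
#include <cstdlib>
#include <cstring>

// Hypothetical stand-in for the real work; the name and buffer sizes are illustrative.
static void do_some_processing(char* buffer, std::size_t size)
{
    std::memset(buffer, 0, size); // pretend to use the memory
}

int main()
{
    // Steps 1-2: allocate on the heap, then process.
    // (Allocation checks omitted, matching the description above.)
    char* first = static_cast<char*>(std::malloc(64 * 1024));
    do_some_processing(first, 64 * 1024);

    // Steps 3-4: allocate more, then process again.
    char* second = static_cast<char*>(std::malloc(256 * 1024));
    do_some_processing(second, 256 * 1024);

    // ...repeated a few times, then the program exits...

    // Note: no free() calls; the question is whether skipping them and letting
    // the OS reclaim everything at exit is acceptable.
    return 0;
}
```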

asked Mar 11 '23 by Antzi

1 Answer

There are a number of potential problems. Examples include:

  • If you can't predict in advance how much memory is actually needed - which is among the most common reasons to use dynamic memory allocation - then your program may exhaust available memory (either because system memory runs out, or because the host operating system imposes quotas on your program). After that, it may or may not run as fast as required, but (even ignoring timing concerns) it will probably misbehave and produce incorrect results. It doesn't matter how much memory your host system has, or what quota it enforces for the programs it hosts - it is possible to exhaust that amount.
  • Not all operating systems release memory as a program exits. And, among those that do, there is the potential that the memory is not fully released - both due to bugs in the OS itself, and due to actions by your program (e.g. allocating resources that are shared with other programs). In such cases, if your program is run repeatedly, you may find that it inexplicably fails on, say, the 32nd run [to pick a random number].
  • As a program allocates more memory, then - depending on how dynamic memory allocation is managed (e.g. the data structures used by malloc()) - the allocations themselves can slow down if memory is not released. That can cause your program to miss timing constraints as it allocates more memory. Releasing memory when it is no longer needed can alleviate such concerns (albeit with other effects, such as memory fragmentation).
  • If you get into the habit of not releasing dynamically allocated memory, you may well also (for similar reasons of "efficiency") not bother to check whether allocations succeed - after all, that takes time too. And that causes problems should an allocation ever fail (e.g. abnormal program termination, trashed memory, wrong results produced without warning, etc); see the sketch after this list.
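
To make the last two points concrete, here is a minimal sketch (the buffer size is an assumption) of checking each allocation and releasing the memory as soon as it is no longer needed:

```cpp
#include <cstdio>
#include <cstdlib>

int main()
{
    // A hypothetical work buffer; the size is illustrative.
    const std::size_t size = 1 << 20;
    char* buffer = static_cast<char*>(std::malloc(size));

    // Check the allocation: if it failed, bail out cleanly instead of trashing
    // memory or silently producing wrong results.
    if (buffer == nullptr) {
        std::fprintf(stderr, "allocation of %zu bytes failed\n", size);
        return EXIT_FAILURE;
    }

    // ... use the buffer ...

    // Release it once it is no longer needed, so later allocations do not have
    // to work around memory the program is still holding.
    std::free(buffer);
    return EXIT_SUCCESS;
}
```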

The bottom line is that allocating memory and not deallocating it is a very poor (and lazy) strategy if you care at all about program performance or timing. If you really care about program performance/timing, you will not actually dynamically allocate memory at all.
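
As a rough illustration of avoiding dynamic allocation entirely: if the program's working set has a known upper bound, it can use fixed-size storage instead. The bound and names here are assumptions, not part of any particular program:

```cpp
#include <array>
#include <cstddef>

// Assumed upper bound on the working set; with this, no heap allocation occurs at all.
constexpr std::size_t kMaxItems = 4096;

int main()
{
    // Fixed-size storage with static duration: the space exists for the whole
    // run, and timing is not affected by the state of the heap.
    static std::array<int, kMaxItems> items{};

    for (std::size_t i = 0; i < kMaxItems; ++i) {
        items[i] = static_cast<int>(i); // ... processing ...
    }
    return 0;
}
```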

If you are using dynamic memory allocation, then you are better off releasing it when no longer needed, even if you don't care about memory footprint. Depending on circumstances, you may find the program runs either faster or slower when it releases memory properly (it depends on numerous variables, including those I've mentioned above, and more). And, should you ever need to reuse your code in a larger program - which, in practice, happens more often than not in the real world - you are more likely to run into problems (memory concerns, performance concerns) if your code does not release memory properly.
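
In C++ specifically, one way to get "release when no longer needed" without sprinkling free()/delete calls everywhere is to let RAII types from the standard library own the memory; a minimal sketch (the processing step is hypothetical):

```cpp
#include <numeric>
#include <vector>

// Hypothetical processing step.
static double process_chunk(std::vector<double>& data)
{
    std::iota(data.begin(), data.end(), 0.0);
    return std::accumulate(data.begin(), data.end(), 0.0);
}

int main()
{
    for (int step = 0; step < 4; ++step) {
        // The vector owns its heap allocation; it is released automatically at
        // the end of each iteration, not hoarded until program exit.
        std::vector<double> data(1 << 16);
        process_chunk(data);
    }
    return 0;
}
```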

answered Apr 07 '23 by Peter