
Does a memory leak at unload of a DLL cause a leak in the host process?

Consider this case:

/* in the host process (needs <windows.h>): */
HMODULE dll = LoadLibraryA("example.dll");
void (*do_work)(void) = (void (*)(void))GetProcAddress(dll, "do_work");
do_work();

/* ... inside the DLL (needs <stdlib.h>); "do" is a C keyword, so the export is named do_work: */
void do_work(void) {
    char *a = malloc(1024);   /* allocated, never freed */
}
/* ... */

FreeLibrary(dll);

At this point, will the 1 KB allocated by the call to malloc() become available to the host process again? The DLL is statically linked to the CRT.

asked Sep 25 '08 by Viktor


People also ask

What happens to a host when a process has a memory leak?

A memory leak reduces the performance of the computer by reducing the amount of available memory. Eventually, in the worst case, so much memory may become allocated that all or part of the system or device stops working correctly, the application fails, or the system slows down drastically due to thrashing.

What is the main cause of memory leaks?

A memory leak occurs when a poorly designed or programmed application fails to free memory segments it no longer needs. The unreleased memory accumulates over time, gradually degrading system performance.

Can memory leaks cause the system to run out of memory?

Memory leaks tie up resources and cause an application to consume more memory over time, degrading system performance. If the leaks are not fixed, the application can eventually run out of memory and terminate, for example with Java's irrecoverable 'OutOfMemoryError'.
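As a minimal C illustration of that failure mode (not from the answer above, and deliberately simplified): a loop that allocates without freeing eventually exhausts memory. Depending on the OS's overcommit policy, malloc may return NULL or the process may be killed first.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    size_t leaked_mib = 0;
    for (;;) {
        char *p = malloc(1024 * 1024);      /* 1 MiB, never freed */
        if (p == NULL) {                    /* allocation finally fails */
            printf("out of memory after leaking %zu MiB\n", leaked_mib);
            return 1;
        }
        memset(p, 1, 1024 * 1024);          /* touch the pages so they are really committed */
        leaked_mib++;
    }
}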

How can I tell which process is leaking memory?

You can run the top command (to run it non-interactively, type top -b -n 1) and watch each process's memory columns over time: on Linux, RES (resident memory) and VIRT (virtual memory); on macOS, RPRVT (resident private address space size) and RSHRD (resident shared address space size). A process whose memory use grows steadily without bound is the likely leaker.


2 Answers

  1. The memory use the OS tracks for a process applies to the process as a whole, not to any individual DLL.

  2. The OS hands memory to the program in large chunks called heaps.

  3. The heap managers (malloc, new, etc.) subdivide these chunks and hand the pieces out to requesting code.

  4. Only when a new heap is allocated does the OS see the process's memory use grow.

  5. When a DLL is statically linked to the C runtime library (CRT), a private copy of the CRT, containing the CRT functions the DLL's code invokes, is compiled into the DLL's binary. malloc is included in this copy.

  6. This private copy of malloc is invoked whenever code inside the statically linked DLL allocates memory.

  7. Consequently, this private malloc acquires from the OS a private heap, visible only to itself, and satisfies the DLL code's allocation requests from that heap.

  8. When the DLL unloads, its private heap is destroyed, so the leak goes unnoticed: the entire heap is returned to the OS (the sketch after this answer shows one way to observe this from the host side).

  9. However, if the DLL is dynamically linked to the CRT, memory is allocated by a single shared copy of malloc that is global to all code linked this way.

  10. Memory allocated by this global malloc comes out of a heap shared with all other dynamically (a.k.a. shared) linked code. Any leak from this heap therefore persists and affects the whole process.

Edit - Added descriptions of the linking scenario.
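As an aside (not part of the original answer), one way to watch this happen is to count the process's heaps around load and unload. A minimal sketch, assuming a hypothetical leaky.dll built against a statically linked CRT; note that older MSVC CRTs created a private heap in this scenario, while the modern Universal CRT typically forwards to the shared process heap, in which case the count will not change:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE heaps[256];
    DWORD before = GetProcessHeaps(256, heaps);

    HMODULE dll = LoadLibraryA("leaky.dll");   /* hypothetical DLL */
    if (dll == NULL) {
        fprintf(stderr, "could not load leaky.dll\n");
        return 1;
    }
    DWORD loaded = GetProcessHeaps(256, heaps);

    FreeLibrary(dll);
    DWORD unloaded = GetProcessHeaps(256, heaps);

    printf("heaps: %lu before load, %lu while loaded, %lu after unload\n",
           before, loaded, unloaded);
    return 0;
}

If the DLL's CRT does create a private heap, the count rises on load and drops back on unload, taking any blocks leaked inside it along.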

answered Dec 26 '22 by computinglife


You can't tell. This depends on the implementation of your static and dynamic CRT. It may even depend on the size of the allocation, as there are CRTs that forward large allocations to the OS, but implement their own heap for small allocations.

The problem with a CRT that leaks is, of course, that it leaks. The problem with a CRT that does not leak, i.e. one that destroys its heap on unload, is that the executable might reasonably expect to keep using the memory, since malloc'ed memory should remain usable until free is called.
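A sketch of that second hazard, using a hypothetical leaky.dll that exports dll_strdup (a strdup-like function allocating with the DLL's own statically linked malloc); whether the final line is safe depends entirely on what the DLL's CRT does at unload:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    HMODULE dll = LoadLibraryA("leaky.dll");   /* hypothetical DLL */
    char *(*dll_strdup)(const char *) =
        (char *(*)(const char *))GetProcAddress(dll, "dll_strdup");

    char *p = dll_strdup("hello");   /* lives on the DLL's CRT heap */
    FreeLibrary(dll);                /* a static CRT may destroy that heap here */
    puts(p);                         /* undefined behavior if the heap is gone */
    return 0;
}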

answered Dec 26 '22 by MSalters