
Why does my Delphi program's memory continue to grow?

I am using Delphi 2009 which has the FastMM4 memory manager built into it.

My program reads in and processes a large dataset. All memory is freed correctly whenever I clear the dataset or exit the program. It has no memory leaks at all.

Using the CurrentMemoryUsage routine given in spenwarr's answer to: How to get the Memory Used by a Delphi Program, I have displayed the memory used by FastMM4 during processing.
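
For reference, that routine boils down to asking Windows for the process working set via GetProcessMemoryInfo from the PsAPI unit. A minimal sketch (the error handling is mine, not necessarily spenwarr's exact code):

    uses Windows, PsAPI, SysUtils;

    // Returns the calling process's working set size, in bytes.
    function CurrentMemoryUsage: Cardinal;
    var
      pmc: TProcessMemoryCounters;
    begin
      pmc.cb := SizeOf(pmc);
      if GetProcessMemoryInfo(GetCurrentProcess, @pmc, SizeOf(pmc)) then
        Result := pmc.WorkingSetSize
      else
        RaiseLastOSError;
    end;

The KB figures below are this value divided by 1024, sampled after each load or clear step.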

What seems to be happening is that memory use is growing after every process-and-release cycle, e.g.:

1,456 KB used after starting my program with no dataset.

218,455 KB used after loading a large dataset.

71,994 KB after clearing the dataset completely. If I exit at this point (or any point in my example), no memory leaks are reported.

271,905 KB used after loading the same dataset again.

125,443 KB after clearing the dataset completely.

325,519 KB used after loading the same dataset again.

179,059 KB after clearing the dataset completely.

378,752 KB used after loading the same dataset again.

It seems that my program's memory use is growing by about 53,400 KB upon each load/clear cycle. Task Manager confirms that this is actually happening.

I have heard that FastMM4 does not always release all of a program's memory back to the operating system when objects are freed, so that it has some on hand the next time memory is needed. But this continual growth bothers me. Since no memory leaks are reported, I can't identify a problem.

Does anyone know why this is happening, if it is bad, and if there is anything I can or should do about it?


Thank you dthorpe and Mason for your answers. You got me thinking and trying things that made me realize I was missing something. So detailed debugging was required.

As it turns out, all my structures were being properly freed upon exit, but not all of them were being released between cycles during the run. Data was accumulating in structures that my exit cleanup did free correctly, which is why nothing was ever reported as a leak on exit.

There were some StringLists and other structures I needed to clear between the cycles (see the sketch below). I'm still not sure how my program worked correctly with the extra data still there from the earlier cycles, but it did. I'll probably research that further.
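
For anyone hitting the same thing, the missing cleanup was of this general shape (a sketch with illustrative names, not my actual code). Note that TStringList.Clear releases the strings but does not free anything attached through Objects[]:

    procedure ClearListAndObjects(List: TStringList);
    var
      i: Integer;
    begin
      // Free the objects the list refers to; Clear alone leaves them allocated.
      for i := 0 to List.Count - 1 do
        List.Objects[i].Free;
      List.Clear;
    end;

Running this kind of cleanup between cycles, rather than only on exit, is what resolved it for me.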

This question has been answered. Thanks for your help.

asked Apr 02 '10 by lkessler


1 Answer

The CurrentMemoryUsage utility you linked to reports your application's working set size. Working set is the total number of pages of virtual memory address space that are mapped to physical memory addresses. However, some or many of those pages may have very little actual data stored in them. The working set is thus the "upper bound" of how much memory your process is using. It indicates how much address space is reserved for use, but it does not indicate how much is actually committed (actually residing in physical memory) or how much of the pages that are committed are actually in use by your application.

Try this: after you see your working set size creep up over several test runs, minimize your application's main window. You will most likely see the working set size drop significantly. Why? Because when you minimize an application, Windows effectively calls SetProcessWorkingSetSize with -1 for the minimum and maximum sizes, which discards unused pages and trims the working set to a minimum. The OS doesn't do this while the app window is normal sized because trimming the working set too often can hurt performance by forcing data to be reloaded from the swap file.

To get into it in more detail: your Delphi application allocates memory in fairly small chunks - a string here, a class there. The average allocation is typically less than a few hundred bytes. It's difficult to manage small allocations like this efficiently on a system-wide scale, so the operating system doesn't. It manages large memory blocks efficiently instead, particularly at the 4 KB virtual memory page size and the 64 KB minimum granularity of virtual address reservations.

This presents a problem for applications: applications typically allocate small chunks, but the OS doles out memory in rather large chunks. What to do? Answer: suballocate.

The Delphi runtime library's memory manager and the FastMM replacement memory manager (and the runtime libraries of just about every other language or toolset on the planet) both exist to do one thing: carve up big memory blocks from the OS into smaller blocks used by the application. Keeping track of where all the little blocks are, how big they are, and whether they've been "leaked" requires some memory as well - called overhead.

In situations of heavy memory allocation and deallocation, you can deallocate 99% of what you allocated and yet see the process's working set size shrink by only, say, 50%. Why? Most often this is caused by heap fragmentation: one small block of memory is still in use inside one of the large blocks that the Delphi memory manager obtained from the OS and divvied up internally. The internal count of memory in use is small (300 bytes, say), but since that block prevents the heap manager from releasing the big OS block containing it, the working set contribution of that little 300-byte chunk is more like 4k (or 64k, depending on whether it's virtual pages or virtual address space - I can't recall).
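
A contrived sketch of this effect (illustrative only, not code from the question): allocate many small blocks, then free all but a scattered few. Almost all of the bytes go back to the memory manager, but each survivor pins the large OS chunk it lives in, so the working set shrinks far less than the bytes freed would suggest:

    procedure DemonstrateFragmentation;
    var
      Blocks: array of Pointer;
      i: Integer;
    begin
      SetLength(Blocks, 100000);

      // 100,000 small allocations; FastMM serves these by suballocating
      // from large chunks it obtained from the OS.
      for i := 0 to High(Blocks) do
        GetMem(Blocks[i], 300);

      // Free 99% of them, keeping every 100th block alive.
      for i := 0 to High(Blocks) do
        if i mod 100 <> 0 then
        begin
          FreeMem(Blocks[i]);
          Blocks[i] := nil;
        end;

      // Only ~1% of the bytes remain in use, but the survivors are
      // scattered across many large OS chunks, none of which can be
      // returned to Windows until it is completely empty.
    end;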

In a memory-intensive operation involving megabytes of small allocations, heap fragmentation is very common - particularly if allocations for things unrelated to the big job are going on at the same time. For example, if crunching through your 80MB database operation also outputs status to a listbox as it progresses, the strings used to report status will be scattered in the heap amongst the database memory blocks. When you release all the memory blocks used by the database computation, the listbox strings are still out there (in use, not lost), but they are scattered all over the place, each little string potentially pinning an entire OS big block.

Try the minimize-window trick to see if it reduces your working set. If it does, you can discount the apparent "severity" of the numbers returned by the working set counter. You could also add a call to SetProcessWorkingSetSize after your big compute operation to purge the pages that are no longer in use.
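
A minimal sketch of that purge call, assuming the SetProcessWorkingSetSize declaration in Delphi's Windows unit (the parameter type, DWORD or SIZE_T, varies by Delphi version):

    uses Windows;

    // Passing -1 for both minimum and maximum asks Windows to trim
    // unused pages from the working set, the same thing that happens
    // when the main window is minimized.
    procedure TrimWorkingSet;
    begin
      SetProcessWorkingSetSize(GetCurrentProcess, SIZE_T(-1), SIZE_T(-1));
    end;

Call it once after the big operation completes, not inside a loop, since trimming too often forces pages to be reloaded from disk.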

answered by dthorpe