I've noticed that calling map.clear() and map.rehash(0) doesn't release the expected memory back to RAM. I created a program with the following code:
#include <string>
#include <unordered_map>

int main() {
    std::unordered_map<std::string, int> m1;
    std::unordered_map<std::string, int> m2;

    // fill the maps m1 and m2 with 5 million elements each
    // bucket_count = 5098259; size = 5000000

    // now I clear m1
    m1.clear();
    m1.rehash(0);
    // m1: bucket_count = 2; size = 0
    // m2: bucket_count = 5098259; size = 5000000

    // now I clear m2
    m2.clear();
    m2.rehash(1);
    // m1: bucket_count = 2; size = 0
    // m2: bucket_count = 2; size = 0
}
According to the bucket_count and size reported after each step of the program, a lot of memory should be released back to RAM. However, checking the RAM used with a system monitor, I see the following evolution: even after clearing and rehashing both maps, the memory is only released when the program finishes. There is actually a very small release of memory, but I'm not sure it is even visible in the picture. How is this possible?
How can I actually free that memory from the RAM?
I am using a dynamic programming algorithm to solve the Travelling Salesman Problem on Linux with 4 GB of RAM. It crashes due to lack of memory, so I am trying to optimize the program's memory usage.
After some improvements I'm keeping two unordered_maps, costs and new_costs. After an iteration ends, I swap them so that costs holds the newly computed values, then I clear new_costs, trying to free memory, and start building the next iteration.
The code is something like this:
std::unordered_map<std::string, int> costs(5098259);
std::unordered_map<std::string, int> new_costs(5098259);

for (int m = 1; m <= n; m++) {
    new_costs.clear();
    new_costs.rehash(5098259);
    while (something) {
        // I build the content of new_costs based on the content of costs
    }
    std::swap(costs, new_costs);
}
At most I will have to store 2496144 and 2704156 elements in them respectively, which makes a total of 5200300 elements stored at the same time. Given that each key is a string of at most 70 characters (71 bytes) and each stored value is a float (4 bytes), that is roughly 380 MB of data.
I know that unordered_map is not memory-efficient, but the program completely uses up my RAM and my swap space, so I must be missing something.
How do I force my std::map to deallocate memory used?
I've been using C++ for a week or so, and I don't know whether the memory management of std::map and std::unordered_map is the same, or whether there are relevant differences between them. I also don't know whether "deallocate" means exactly the same as releasing the memory used by an object, so that more RAM becomes available. If both are the same, I am basically asking the same question. In any case, the answers to that question explain the reason behind the memory retention rather than providing solutions to it.
You have three levels of memory use to worry about, and your process memory footprint is probably showing you the wrong one.
in-use memory
Stuff in use in your container. Easy.
per-container allocator
Your container has an allocator. This may decide to allocate bigger chunks than requested, and may also keep released memory around for later re-use. Both of these generally improve speed at the cost of memory use.
You can probably release this cache using the swap trick: create an empty temporary container and std::swap the two. This swaps the allocators as well, meaning any cached storage held by the old one is released when the temporary goes out of scope.
For some value of released, anyway: the memory will likely be returned to your ...
process-wide dynamic allocator (often "heap").
This may decide to keep the memory cached in your process for exactly the same reason. It's under no obligation to return that memory to the system (until your process exits).
The only way to force your program to return memory to the system is to write your own allocator, using appropriate system calls to do this explicitly.
The sensible way to reason about your program's memory use, on a system with virtual memory, is to worry about how much memory is used and not worry too much about the size of its virtual address space.
NB. If memory use keeps growing despite you clearing and re-using containers, you either have a leak or are suffering from memory fragmentation. This really is a problem. When memory use just doesn't decrease during running, it's entirely normal and nothing to worry about.