I am trying to produce a special build of a large monolithic application. The problem I am trying to solve is tracking down hard-to-reproduce huge memory allocations (30-80 gigabytes, judging by what the OS reports).
I believe the culprit is an std::vector resized to a negative 32-bit integer value, which is implicitly converted to an enormous unsigned size. The only platform exhibiting this behavior is Solaris (maybe it is the only platform that manages to successfully allocate such chunks of contiguous memory).
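For illustration, here is one plausible path to such a request (hypothetical values, assuming a 64-bit build):

#include <vector>

int main()
{
    int n = -1;              // e.g. an overflowed 32-bit length computation
    unsigned count = n;      // wraps around to 0xFFFFFFFFu
    std::vector<double> v;
    v.resize(count);         // ~4.3e9 elements * 8 bytes: a ~34 GB request
}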
Can I globally replace std::vector with my own class that delegates all calls to the real vector while watching for suspicious allocations (size > 0x7FFFFFFFu)? Maybe selectively replace just the constructor that takes a size and the resize() methods? Maybe even hijack the global operator new?
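Something like the rough sketch below is what I have in mind (MyVector is a hypothetical name; it assumes the codebase can be pointed at the wrapper through a typedef):

#include <cassert>
#include <vector>

template <typename T>
class MyVector : public std::vector<T>
{
public:
    MyVector() {}
    explicit MyVector(size_t n) : std::vector<T>(checked(n)) {}
    void resize(size_t n) { std::vector<T>::resize(checked(n)); }
    // ... every other size-taking member forwarded the same way ...
private:
    static size_t checked(size_t n)
    {
        assert(n <= 0x7FFFFFFFu);  // break or assert on suspicious sizes
        return n;
    }
};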
Why not do something like this?
#include <cstdlib>  // malloc
#include <new>      // std::bad_alloc

void *operator new(size_t size)
{
    // if (size > MAX_SIZE) ...
    void *p = malloc(size);
    if (!p) throw std::bad_alloc();  // a throwing operator new must not return null
    return p;
}

void *operator new[](size_t size)
{
    // if (size > MAX_SIZE) ...
    void *p = malloc(size);
    if (!p) throw std::bad_alloc();
    return p;
}
Setting a breakpoint inside the if would find the problem right away.
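Since this replacement new allocates with malloc, it is safest to replace the matching deallocation functions as well so allocation and release stay paired. A minimal sketch (pre-C++11, declare them with throw() instead of noexcept):

#include <cstdlib>  // free

void operator delete(void *p) noexcept
{
    free(p);
}

void operator delete[](void *p) noexcept
{
    free(p);
}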
You can provide a custom allocator for your vector at the time it's constructed. In the first instance, you could simply delegate to std::allocator and firewall the requested memory size before forwarding each request.
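A minimal sketch of such an allocator, assuming the C++11 minimal-allocator interface (firewall_allocator and MAX_BYTES are illustrative names; pre-C++11 you would also need the full set of typedefs and a rebind member):

#include <cassert>
#include <cstddef>
#include <memory>

// Hypothetical threshold: anything above it is treated as suspicious.
static const std::size_t MAX_BYTES = 0x7FFFFFFFu;

template <typename T>
struct firewall_allocator
{
    typedef T value_type;

    firewall_allocator() {}
    template <typename U>
    firewall_allocator(const firewall_allocator<U>&) {}

    T* allocate(std::size_t n)
    {
        // Break or assert here to catch the rogue request before it happens.
        assert(n * sizeof(T) <= MAX_BYTES);
        return std::allocator<T>().allocate(n);
    }
    void deallocate(T* p, std::size_t n)
    {
        std::allocator<T>().deallocate(p, n);
    }
};

template <typename T, typename U>
bool operator==(const firewall_allocator<T>&, const firewall_allocator<U>&) { return true; }
template <typename T, typename U>
bool operator!=(const firewall_allocator<T>&, const firewall_allocator<U>&) { return false; }

Usage would then be std::vector<int, firewall_allocator<int> > v; for the containers under suspicion.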