I have a C/C++ program that might be hanging when it runs out of memory. We discovered this by running many copies at the same time. I want to debug the program without completely destroying performance on the development machine. Is there a way to limit the memory available so that a call to new or malloc will return a NULL pointer after, say, 500K of memory has been requested?
What happens when your machine runs out of RAM is that the OS starts paging to disk ("virtual memory") to compensate, and everything slows down. On macOS, once physical RAM is exhausted the system creates swap files, which take up disk space; they keep growing until the drive has no room left for them, at which point the system reports that it has run out of application memory. That is why running many copies at once can grind the whole development machine to a halt.
Try turning the question on its head and asking how to limit the amount of memory an OS will allow your process to use.
Try looking into ulimit: http://ss64.com/bash/ulimit.html
For example, running "ulimit -v 1024" in the shell before launching the program caps its virtual address space at 1024 kB. Note that the limit applies to the whole address space (code, stack, and shared libraries, not just the heap), so set it somewhat above the 500K you actually want malloc to hand out.
Here is another link that's a little old but gives a little more background: http://www.network-theory.co.uk/docs/gccintro/gccintro_77.html
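If you would rather set the cap from inside the program itself (so only the process under test is affected), POSIX setrlimit() does the same job programmatically. A minimal sketch, assuming Linux, where RLIMIT_AS limits the whole virtual address space; the 64 MB figure is just an illustrative cap:

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/resource.h>

    int main(void)
    {
        /* Cap the process's virtual address space at 64 MB (illustrative). */
        struct rlimit rl;
        rl.rlim_cur = 64 * 1024 * 1024;   /* soft limit, in bytes */
        rl.rlim_max = 64 * 1024 * 1024;   /* hard limit */
        if (setrlimit(RLIMIT_AS, &rl) != 0) {
            perror("setrlimit");
            return 1;
        }

        /* Allocations beyond the cap now fail instead of pushing into swap. */
        void *p = malloc(128 * 1024 * 1024);
        if (p == NULL)
            puts("malloc returned NULL as expected");
        free(p);
        return 0;
    }

As with ulimit -v, the limit covers everything the process maps, so it has to be comfortably larger than the heap budget you are actually testing.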
One way is to write a wrapper around malloc().
    #include <stdlib.h>

    #define LIMIT (500 * 1024)   /* 500K, per the question */

    static size_t requested = 0;

    void *my_malloc(size_t amount)
    {
        if (requested + amount < LIMIT) {
            requested += amount;   /* note: free() never credits bytes back */
            return malloc(amount);
        }
        return NULL;
    }
You could use a #define to route every malloc call through the wrapper.
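Something like this, as a sketch; "my_malloc.h" is a hypothetical header name, and it must be included after <stdlib.h> and kept out of the file that defines my_malloc(), or the wrapper would end up calling itself:

    /* my_malloc.h (hypothetical) */
    #ifndef MY_MALLOC_H
    #define MY_MALLOC_H

    #include <stddef.h>

    void *my_malloc(size_t amount);   /* the wrapper above */

    /* Every malloc(...) call in files that include this header
       is rewritten to go through the wrapper. */
    #define malloc(amount) my_malloc(amount)

    #endif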
As GMan states, you could overload the new/delete operators as well, for the C++ case.
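A minimal sketch of that, reusing the same requested/LIMIT bookkeeping as above (a real version would also credit deallocations back). One caveat: plain new reports failure by throwing std::bad_alloc rather than returning NULL, so if you want the NULL-return behaviour from the question, call new(std::nothrow) instead:

    #include <cstdlib>
    #include <new>

    static std::size_t requested = 0;
    static const std::size_t LIMIT = 500 * 1024;   // 500K, per the question

    void *operator new(std::size_t amount)
    {
        void *p = (requested + amount < LIMIT) ? std::malloc(amount) : nullptr;
        if (p == nullptr)
            throw std::bad_alloc();   // plain new signals failure by throwing
        requested += amount;
        return p;
    }

    void operator delete(void *p) noexcept
    {
        std::free(p);   // sketch: bytes are not credited back to 'requested'
    }

With this replacement in place, new(std::nothrow) returns nullptr once the budget is exceeded, since the default nothrow variant calls the throwing one and catches the exception.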
Not sure if that's the best way, or exactly what you're looking for.