Recently, I have been working on a video player for a CCTV system on Windows. Since the program has to decode and play many video streams at the same time, I figured it might run into situations where malloc fails, so I added a check after every malloc.
But generally speaking, in the open source code I've read, I seldom find any checking of the result of malloc. So when malloc fails, most programs will simply crash. Isn't that unacceptable?
A colleague of mine who writes server programs on Linux allocates enough memory for 100 client connections up front. So although his program may refuse the 101st client, it will never see malloc fail. Is his approach also suitable for desktop applications?
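The preallocation approach your colleague uses can be sketched roughly like this (the sizes, names, and fixed-slice layout here are hypothetical, just to illustrate the idea):

```c
#include <stdlib.h>

/* Hypothetical figures; a real server would use its own. */
#define MAX_CLIENTS 100
#define PER_CLIENT_BYTES 4096

static unsigned char *pool;   /* one up-front allocation */
static int clients_in_use;

/* Reserve everything at startup, so the only malloc that can fail
 * is this one, and it fails before any client ever connects. */
int pool_init(void)
{
    pool = malloc((size_t)MAX_CLIENTS * PER_CLIENT_BYTES);
    return pool != NULL ? 0 : -1;
}

/* Hand out one fixed slice per client; refuse the 101st client
 * instead of risking an allocation failure mid-session. */
unsigned char *client_alloc(void)
{
    if (clients_in_use >= MAX_CLIENTS)
        return NULL;          /* politely refuse */
    return pool + (size_t)(clients_in_use++) * PER_CLIENT_BYTES;
}
```

The trade-off is that the memory is committed whether or not 100 clients ever connect, which is usually acceptable for a server but less so for a desktop application.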
On Linux, malloc() will essentially never fail: because of memory overcommit, the OOM killer is triggered instead and begins killing processes until the system recovers (or falls over). Since Linux is the most popular UNIX derivative in use today, many developers have learned to just never check the result of malloc(). That's probably why your colleagues ignore malloc() failures.
On OSes which do report allocation failures, I've seen two general patterns:

1. Write a custom wrapper which checks the result of malloc() and calls abort() if the allocation failed. For example, the GLib and GTK+ libraries use this approach.
2. Store a global list of "purgeable" allocations, such as caches, which can be cleared when an allocation fails. Then retry the allocation, and if it still fails, report it via the standard error-reporting mechanisms (which do not perform dynamic allocation).
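The first pattern is just a few lines. This is not GLib's actual g_malloc() source, only a minimal sketch of the idea:

```c
#include <stdio.h>
#include <stdlib.h>

/* Checked allocator: never returns NULL, so callers need no checks.
 * The name xmalloc is a common convention, not a standard function. */
void *xmalloc(size_t size)
{
    void *p = malloc(size);
    if (p == NULL && size != 0) {
        /* fprintf of a constant format string does not allocate,
         * so it is safe to call even with the heap exhausted. */
        fprintf(stderr, "fatal: out of memory (%zu bytes)\n", size);
        abort();
    }
    return p;
}
```

Callers then write `char *buf = xmalloc(1024);` with no error path at all; the process dies in one well-defined place instead of dereferencing NULL somewhere later.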
Even on Linux, ulimit can be used to get a prompt malloc error return. It's just that it defaults to unlimited.
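The same cap that `ulimit -v` imposes from the shell can be set from inside a program with setrlimit(RLIMIT_AS), which makes it easy to see malloc() fail promptly on Linux. The 64 MiB figure below is arbitrary:

```c
#include <stdlib.h>
#include <sys/resource.h>

/* Cap our own address space, then try an allocation that cannot
 * fit under the cap.  Returns 1 if malloc correctly returned NULL,
 * 0 if it unexpectedly succeeded, -1 if setrlimit failed. */
int demo_malloc_limit(void)
{
    struct rlimit rl = { 64 * 1024 * 1024, 64 * 1024 * 1024 };
    if (setrlimit(RLIMIT_AS, &rl) != 0)
        return -1;
    void *p = malloc((size_t)1024 * 1024 * 1024);  /* 1 GiB: over the cap */
    free(p);
    return p == NULL;
}
```

With the limit in place, the allocation fails with a NULL return instead of succeeding on overcommitted pages and inviting the OOM killer later.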
There is definite pressure to conform to published standards. On most systems, in the long run, and eventually even on Linux, malloc(3) will return a correct indication of failure. It is true that desktop systems have virtual memory and demand paging, but even then, not checking malloc(3) only works in a debugged program with no memory leaks. If anything goes wrong, someone will want to set a ulimit and track it down. Suddenly, the malloc check makes sense.