 

Bad_alloc not thrown when I expect it to

Consider this simple program:

#include <cstddef>
#include <exception>
#include <iostream>

int main()
{
    // 2^31 elements, computed in std::size_t to avoid signed overflow in 1 << 31
    const std::size_t size = std::size_t(1) << 31;
    int *a = NULL;

    try
    {
        a = new int[size];
    }
    catch (const std::exception &e)
    {
        std::cerr << "caught some bad guy" << std::endl;
        return 1;
    }

    if (a == NULL)
    {
        std::cerr << "it's null, can't touch this" << std::endl;
        return 1;
    }

    std::cerr << "looks like 'a' is allocated alright!" << std::endl;

    for (std::size_t i = 0; i < size; ++i)
        std::cout << a[i] << " ";

    return 0;
}

Commentary

  • I try to allocate a ridiculous amount of memory: (1 << 31) * sizeof(int) == 8 GB
  • I add safety checks
    • Catching std::exception, which should catch std::bad_alloc among other exceptions...
    • Checking that the pointer is not null (for this check to actually make sense, I'd need a = new (std::nothrow) int[size], as sketched after this list - but regardless of how I allocate the memory, it doesn't work)
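
For reference, the nothrow variant of that allocation looks roughly like this (a sketch; it would replace the line inside the try block):

#include <new>  // for std::nothrow

// With nothrow new, failure is reported as a null pointer instead of an
// exception, so the a == NULL check actually becomes meaningful.
a = new (std::nothrow) int[size];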

Environment

  • RAM installed: 2GB
  • Operating system: Debian
  • Architecture: 32-bit

Problem

The problem is that instead of exiting early, the program does this:

rr-@burza:~$ g++ test.cpp -o test && ./test
looks like 'a' is allocated alright!
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
(...many other zeros here...)
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0Segmentation fault

The number of zeros printed is exactly 33790, which tells me exactly... nothing. How can I make my program segfault-proof?

asked Nov 02 '22 by rr-

1 Answer

This seems to be a bug in your environment that causes an integer overflow in the implementation of new[]: on your 32-bit system, the requested byte count, (1 << 31) * sizeof(int) == 2^33, wraps around to 0 in the 32-bit size_t, so in effect you are allocating 0 bytes. It might be this bug. The C++03 standard is not clear about what should happen here; in C++11, std::bad_array_new_length should be thrown.
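
A minimal sketch of the wraparound arithmetic (it assumes a 32-bit size_t, as on your system):

#include <cstddef>
#include <iostream>

int main()
{
    const std::size_t count = std::size_t(1) << 31; // 2^31 elements
    const std::size_t bytes = count * sizeof(int);  // 2^33 mod 2^32 == 0
    std::cout << bytes << std::endl;                // prints 0 on such a system
    return 0;
}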

If you need to support this system, you can check whether there is a chance of overflow before allocating, for example:

const std::size_t size_t_max = -1;   // -1 converts to SIZE_MAX
if (size > size_t_max / sizeof(int)) // would size * sizeof(int) wrap around?
    throw std::bad_alloc();          // needs <new>; refuse the overflowing request

This bug might still affect you, however, if the libraries you use don't have such checks (for example, the implementation of std::vector); a defensive variant for that case is sketched below.
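
A minimal sketch of the same idea applied to std::vector (grow_checked is a hypothetical helper name, and it assumes max_size() reports something sensible even on the affected system):

#include <cstddef>
#include <new>
#include <vector>

void grow_checked(std::vector<int> &v, std::size_t count)
{
    // Refuse sizes the container cannot represent instead of trusting the
    // possibly buggy allocation path to detect the overflow itself.
    if (count > v.max_size())
        throw std::bad_alloc();
    v.resize(count);
}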

answered Nov 10 '22 by zch