 

Allocating large blocks of memory with new

I have the need to allocate large blocks of memory with new.

I am stuck with using new because I am writing a mock for the producer side of a two part application. The actual producer code is allocating these large blocks and my code has responsibility to delete them (after processing them).

Is there a way I can ensure my application is capable of allocating such a large amount of memory from the heap? Can I set the heap to a larger size?

My case is 64 blocks of 288000 bytes each. Sometimes 12 blocks allocate successfully, other times 27, before I get a std::bad_alloc exception.

This is: C++, GCC on Linux (32bit).
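
Roughly, the allocation pattern looks like this (a minimal sketch; new char[] is a stand-in here, since the real blocks come from the producer code):

    #include <cstddef>
    #include <iostream>
    #include <new>

    int main() {
        const std::size_t kBlocks = 64;        // block count from above
        const std::size_t kBlockSize = 288000; // bytes per block
        char* blocks[kBlocks] = {};

        try {
            for (std::size_t i = 0; i < kBlocks; ++i) {
                blocks[i] = new char[kBlockSize]; // producer-side allocation
            }
            std::cout << "allocated all " << kBlocks << " blocks\n";
        } catch (const std::bad_alloc&) {
            std::cout << "bad_alloc partway through\n";
        }

        // Consumer side: my code deletes the blocks after processing.
        for (std::size_t i = 0; i < kBlocks; ++i) {
            delete[] blocks[i]; // delete[] on a null pointer is a no-op
        }
        return 0;
    }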

asked Feb 26 '09 by JeffV


3 Answers

With respect to new in C++/GCC/Linux(32bit)...

It's been a while, and it's implementation-dependent, but I believe new will, behind the scenes, invoke malloc(). And malloc(), unless you ask for something exceeding the address space of the process, or outside of configured (ulimit/getrlimit) limits, won't fail, even when your system doesn't have enough RAM + swap. For example: malloc() of 1 GB on a system with 256 MB of RAM and no swap will, I believe, succeed.

However, when you actually use that memory, the kernel supplies the pages through a lazy-allocation mechanism. At the point where you first read or write a page, if the kernel cannot supply a physical page to your process, the kernel's out-of-memory (OOM) killer terminates your process.

This can be a problem on a shared computer, when a colleague's program has a slow memory leak. Especially when the OOM killer starts knocking out system processes.
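
Here's a sketch of that behavior (illustrative only; whether it reproduces depends on the kernel's overcommit settings and how much memory the machine has):

    #include <cstdio>
    #include <cstdlib>
    #include <cstring>

    int main() {
        // On many Linux configurations this malloc() succeeds even when
        // RAM + swap cannot back it, because pages are supplied lazily.
        const size_t big = 1UL << 30; // 1 GB (assumes no ulimit in the way)
        char* p = static_cast<char*>(std::malloc(big));
        if (!p) {
            std::printf("malloc failed up front\n");
            return 1;
        }
        std::printf("malloc succeeded; touching pages...\n");
        // Writing to every page forces the kernel to actually supply
        // memory. If it can't, the OOM killer may terminate the process
        // here, rather than malloc() failing above.
        std::memset(p, 0, big);
        std::printf("all pages touched\n");
        std::free(p);
        return 0;
    }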

So the fact that you are seeing std::bad_alloc exceptions is "interesting".

Now, new will also run the constructor on the allocated memory, touching all those memory pages before it returns. Depending on the implementation, it might be trapping the out-of-memory signal and turning it into the std::bad_alloc you're seeing.

Have you tried this with plain ol' malloc()?
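
For comparison, something along these lines (ClassFoo is just a stand-in type here):

    #include <cstddef>
    #include <cstdlib>

    struct ClassFoo {
        int data[100];
        ClassFoo() {
            // The constructor writes to the object, so the pages
            // backing the array are faulted in immediately.
            for (int i = 0; i < 100; ++i) data[i] = 0;
        }
    };

    int main() {
        const std::size_t N = 1000;

        // malloc(): reserves address space; pages aren't necessarily
        // backed by real memory until something reads or writes them.
        void* raw = std::malloc(N * sizeof(ClassFoo));
        std::free(raw);

        // new[]: runs ClassFoo's constructor on every element, which
        // touches (and therefore commits) all the underlying pages.
        ClassFoo* arr = new ClassFoo[N];
        delete[] arr;
        return 0;
    }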

Have you tried running the "free" program? Do you have enough memory available?

As others have suggested, have you checked limit/ulimit/getrlimit() for hard & soft constraints?
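
You can also read those limits from inside the process; a minimal sketch using POSIX getrlimit():

    #include <sys/resource.h>
    #include <cstdio>

    int main() {
        struct rlimit rl;

        // Address-space limit: caps the process's total virtual memory.
        if (getrlimit(RLIMIT_AS, &rl) == 0) {
            std::printf("RLIMIT_AS:   soft=%llu hard=%llu\n",
                        (unsigned long long)rl.rlim_cur,
                        (unsigned long long)rl.rlim_max);
        }

        // Data-segment limit (ulimit -d): can constrain heap growth.
        if (getrlimit(RLIMIT_DATA, &rl) == 0) {
            std::printf("RLIMIT_DATA: soft=%llu hard=%llu\n",
                        (unsigned long long)rl.rlim_cur,
                        (unsigned long long)rl.rlim_max);
        }
        // RLIM_INFINITY shows up as a very large number.
        return 0;
    }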

What does your code look like, exactly? I'm guessing new ClassFoo[N]. Or perhaps new char[N].

What is sizeof(ClassFoo)? What is N?

Allocating 64 * 288000 bytes (about 17.58 MB) should be trivial for most modern machines... Are you running on an embedded system or something otherwise special?

Alternatively, are you linking with a custom new allocator? Does your class have its own new allocator?

Does your data structure (class) allocate other objects as part of its constructor?

Has someone tampered with your libraries? Do you have multiple compilers installed? Are you using the wrong include or library paths?

Are you linking against stale object files? Do you simply need to recompile all your source files?

Can you create a trivial test program? Just a couple lines of code that reproduces the bug? Or is your problem elsewhere, and only showing up here?

--

For what it's worth, I've allocated over 2 GB data blocks with new on 32-bit Linux under g++. Your problem lies elsewhere.

answered by Mr.Ree


It's possible that you are being limited by the process's ulimit; run ulimit -a and check the virtual memory and data seg size limits. Other than that, can you post your allocation code so we can see what's actually going on?

answered by Kieron


Update:

I have since fixed an array indexing bug and it is allocating properly now.

If I had to guess... I was walking all over my heap and trampling malloc's internal data structures. (??)
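
For anyone hitting the same thing, the kind of bug involved looks something like this (a hypothetical sketch, not my actual code):

    int main() {
        char* a = new char[288000];
        char* b = new char[288000];

        // Off-by-one indexing bug: writes one byte past the end of 'a'.
        // That byte can land in the allocator's bookkeeping for the
        // neighboring block, and the damage typically surfaces later,
        // in an unrelated new/delete, rather than here.
        for (int i = 0; i <= 288000; ++i) { // note: <= instead of <
            a[i] = 0;
        }

        delete[] b; // may crash or throw long after the bad write
        delete[] a;
        return 0;
    }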

answered by JeffV