 

bad_alloc with 200 GB of memory available (C++)

I'm new to C++. I'm studying 'compressive sensing', so I need to work with huge matrices, and MATLAB is actually slow, so I programmed my algorithm in C++.

The thing is that I store big arrays (around 100 MB to 1 GB each), about 20 of them. It works fine up to roughly 30 GB of memory use, but when the process needs more than 40 GB it just stops. I think it's a memory problem. I tested it on both Linux and Windows (64-bit OS, 64-bit MinGW compiler, 200 GB RAM, Intel Xeon). Is there any limitation?

size_t tm=n*m*l;
double *x=new double[tm];

I use around 20 arrays like this one. Typical sizes are n, m ~= 1000 and l ~= 30.
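For reference, here is a minimal, self-contained version of that allocation (assuming n, m and l are plain ints, which the question doesn't state): the cast to size_t forces the product to be computed in 64 bits, and catching std::bad_alloc shows exactly which allocation fails.

#include <cstddef>
#include <iostream>
#include <new>

int main() {
    int n = 1000, m = 1000, l = 30;
    // Cast before multiplying so the product is computed as size_t, not int.
    std::size_t tm = static_cast<std::size_t>(n) * m * l;
    try {
        double *x = new double[tm];
        x[tm - 1] = 0.0;          // touch the last element to prove the block is usable
        delete[] x;
    } catch (const std::bad_alloc &e) {
        std::cerr << "allocation of " << tm * sizeof(double)
                  << " bytes failed: " << e.what() << '\n';
        return 1;
    }
    return 0;
}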

Thank you

asked May 12 '15 by Jonathan Arley Monsalve Salaza

1 Answer

20 arrays, and a problem at about 40 GB of memory use in total - that suggests the program breaks when a single array exceeds 2 GB. This should not happen: a 64-bit address space should use a 64-bit size_t for object sizes. It appears that MinGW incorrectly uses a 31-bit size (i.e. it loses a sign bit as well).

I don't know exactly how you allocate memory, but this is perhaps fixable by bypassing the broken allocation routine and going straight to the OS allocator. E.g. on Windows you could call VirtualAlloc (skip HeapAlloc; it's not designed for such large allocations).
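As a rough, Windows-only sketch of that idea (the sizes mirror the question; the error handling is illustrative, not part of the original answer), allocating one large array straight from the OS could look like this:

#include <windows.h>
#include <cstddef>
#include <cstdio>

int main() {
    const std::size_t n = 1000, m = 1000, l = 30;
    const std::size_t tm = n * m * l;                 // element count
    const std::size_t bytes = tm * sizeof(double);    // size in bytes

    // Reserve and commit the whole block in one call; returns nullptr on failure.
    void *raw = VirtualAlloc(nullptr, bytes, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
    if (!raw) {
        std::fprintf(stderr, "VirtualAlloc failed, error %lu\n", GetLastError());
        return 1;
    }
    double *x = static_cast<double*>(raw);
    x[tm - 1] = 3.14;                     // touch the last element

    VirtualFree(raw, 0, MEM_RELEASE);     // size must be 0 with MEM_RELEASE
    return 0;
}

The same approach on Linux would go through mmap instead; either way the point is to take the broken new[] path out of the picture.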

answered Nov 10 '22 by MSalters