 

Resizing QByteArray throws std::bad_alloc when only using 600 MB of memory

Tags:

c++

windows

qt

I am new to Qt and need to load and process some large files; instead, I am running out of memory. The following code illustrates the problem:

#include <iostream>
#include <QByteArray>

QByteArray mem;
for (int i = 1; i <= 20; ++i)
{
    std::cout << "eating " << (i * 100) << "MB" << std::endl;
    mem.resize(i * 100 * 1024 * 1024);
}

I am getting std::bad_alloc when it reaches 600MB. That really should not happen. Is there a secret switch to increase the heap size?

I am using Qt 5.0.2 on Windows and the Visual C++ 10.0 x86 compiler.

MadDave asked Mar 24 '23 11:03

1 Answer

AFAIK QByteArray stores its data in a single contiguous block of memory. On a 32-bit (x86) build, a Windows process has only about 2 GB of usable address space, and resize() must allocate the new, larger block while the old one still exists so the data can be copied over. So even though your application might still have plenty of virtual memory available in total, there is a good chance that the memory manager simply cannot find a contiguous free region that is large enough.

If you need to process some large files, instead of loading them into memory in one chunk, I would recommend memory mapping a "viewport" into the file and processing it that way. Depending on the size of the file, you may even be able to memory map the whole file in one go. That is also more efficient on Windows than reading the file byte by byte, as it uses the virtual memory system to page in the relevant parts of the file on demand.

Timo Geusch answered Apr 05 '23 22:04