C++ Reading File into Char Array

I am using the following code to read a file into a character array. For a small file (say 2 MB) it executes properly, but for a large file (140 MB) it gives a segmentation fault on my 18 GB Ubuntu server. Can anybody help me solve this? I think 18 GB is enough to hold a 240 MB file in memory. I am using 64-bit Ubuntu and compiling with g++.

ifstream is;

char chararray[fileSize];

is.read(chararray, fileSize);
asked Feb 19 '23 by user1838343


2 Answers

If the array is a local variable you will get a stack overflow, as it will not fit on the stack. Allocate the "array" on the heap instead, either directly using new or indirectly by using std::vector.

Or use memory mapping. See the mmap function.
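
As a rough illustration of the memory-mapping route, here is a minimal sketch assuming POSIX/Linux; the file name "input.dat" and the error handling are placeholders, not part of the original answer:

#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>

int main()
{
    // Open the file read-only ("input.dat" is a placeholder name).
    int fd = open("input.dat", O_RDONLY);
    if (fd == -1) { perror("open"); return 1; }

    // Ask the kernel for the file size.
    struct stat sb;
    if (fstat(fd, &sb) == -1) { perror("fstat"); close(fd); return 1; }

    // Map the whole file; pages are brought in on demand, so nothing has to
    // fit on the stack and no explicit read() copy is needed.
    char *data = static_cast<char *>(mmap(nullptr, sb.st_size, PROT_READ, MAP_PRIVATE, fd, 0));
    if (data == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    // ... use data[0] .. data[sb.st_size - 1] like a read-only char array ...

    munmap(data, sb.st_size);
    close(fd);
    return 0;
}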

answered Feb 21 '23 by Some programmer dude


Instead of allocating the char array on the stack, I'd try using std::vector, which will allocate dynamically on the heap:

std::vector<char> buffer(fileSize);
is.read(&buffer[0], fileSize);
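
Note that this assumes fileSize is already known. One common way to obtain it (just a sketch; the file name is a placeholder) is to open the stream in binary mode and seek to the end; since C++11 you can also pass buffer.data() instead of &buffer[0]:

#include <fstream>
#include <vector>

std::ifstream is("input.dat", std::ios::binary);  // placeholder file name
is.seekg(0, std::ios::end);
std::streamsize fileSize = is.tellg();            // file size in bytes
is.seekg(0, std::ios::beg);

std::vector<char> buffer(fileSize);               // storage lives on the heap
is.read(buffer.data(), fileSize);                 // data() == &buffer[0] since C++11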
answered Feb 21 '23 by Mr.C64