
How to directly read a huge chunk of memory into std::vector?

I have a huge contiguous array x that I fread from a file.

How do I drop this chunk into a std::vector<>? In other words, I would prefer the result to be in a std::vector<> rather than the array, but I want the resulting C++ code to be as efficient as this plain C version, which drops the chunk right into the array.

From searching around, I think I may have to use placement-new in some form, but I'm uncertain about the sequence of calls and ownership issues. Also, do I need to worry about alignment issues?

I am testing with T = unsigned, but I expect a reasonable solution to work for any POD struct.

using T = unsigned;
FILE* fp = fopen( outfile.c_str(), "rb" );  // binary mode for raw data
T* x = new T[big_n];
fread( x, sizeof(T), big_n, fp );

// how do I get x into std::vector<T> v
// without calling a gazillion push_backs() or copies ?!?

delete[] x;
fclose( fp );
asked Jan 17 '13 by kfmfe04

1 Answer

Use the std::vector constructor that sets the size of the vector, then use std::vector::data to get a pointer to the allocated memory.

Keeping with your use of fread:

std::vector<T> x(big_n);
fread(x.data(), sizeof(T), big_n, fp);

As noted by others, fread will most likely not work if the type T is not a POD type. You can then use C++ streams and std::istreambuf_iterator to read the file into the vector. However, this has the drawback that it loops over every item in the file, and if big_n is as big as it sounds then this might be a performance problem.


However, if the file truly is big, I would rather recommend using memory mapping to read the file.

answered Nov 10 '22 by Some programmer dude