I am new to High Performance Computing, and this is my first question in this forum, where I have been a reader for a long time.
Basically I need to do arithmetic operations on very large arrays, for instance:
double variable[9][4][300][300][300]; (uninitialized)
case 1: If I declare the array above as local/automatic, then I get a run-time error when I compile without optimization ("g++ file.cpp"). The error is a segmentation fault -- a stack overflow?
case 2: In the same case as above, if I compile with optimization ("g++ -O2 file.cpp"), the code runs as expected. Is the array in .bss now?
case 3: If I make the variable global/static, then it compiles fine, but it does not run: it just prints "Killed" on the terminal and terminates.
There is no real problem, but I am curious and want to learn what happens when extremely large arrays are declared, and where they reside in memory depending on how and where they are declared.
I am also aware of generating these arrays at run-time using malloc or new; then, of course, they would live on the heap.
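For instance (a sketch using std::make_unique, which keeps the same 5-D indexing):

#include <memory>

int main() {
    using block = double[4][300][300][300];
    auto p = std::make_unique<block[]>(9);  // single ~7.8 GB zero-initialized heap allocation
    p[8][3][299][299][299] = 1.0;           // indexes exactly like the original array
}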
So the most important question for me is: which is the most efficient method (i.e. the smallest run time when computing on arrays in memory) of dealing with large arrays when compiling with g++ and running on Linux clusters?
Thank you for your patience in reading this.
Local variables live on the stack, no matter the optimization flags, and that array is about 7.8 GB (9 × 4 × 300³ doubles at 8 bytes each): way larger than any possible stack. If it appears to run at -O2, the likely reason is that the optimizer proved the array unused and removed it entirely.
The size may also be the reason it doesn't start: if you make it a global/static variable, then you need more than 7 GB of virtual memory free and contiguous to be able to even run the program, and the "Killed" message is typically the kernel's OOM killer stepping in when that memory runs out.
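You can check the stack limit yourself; a small Linux-specific sketch using getrlimit (the soft limit commonly defaults to 8 MB):

#include <sys/resource.h>
#include <cstdio>

int main() {
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) == 0)        // query the current stack size limit
        std::printf("stack soft limit: %llu bytes\n",
                    (unsigned long long)rl.rlim_cur);  // prints a huge value if unlimited
}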
May I suggest something along the lines of:
#include <vector>

struct slice { double data[300][300][300]; };  // wrapper: std::vector cannot hold a raw array type
std::vector<slice> variable[9];
for (auto& v : variable) v.resize(4);          // each vector heap-allocates its 4 slices
This way each vector of 4 slice objects is dynamically allocated, the contents of the 9 vectors need not be contiguous with each other, and the stack consumption is only enough for the metadata of 9 vectors.
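Access then looks like this (indices picked arbitrarily for illustration):

variable[2][1].data[150][150][150] = 3.14;  // array 2, slice 1, element (150,150,150)

Note that resize() value-initializes the new slices, so all the doubles start out as zero.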