 

Creation of a large std::array causes segfault?

Tags:

c++

linux

gcc

c++11

I want to create a large std::array and fill it with random data. The problem is that if I declare the std::array, the program segfaults (GDB says it segfaults on the `auto start..` line); if I comment out the array declaration, the program runs fine.

Here is an SSCCE (Short, Self-Contained, Correct Example):

#include <array>
#include <cstdint>
#include <iostream>
#include <chrono>

static const constexpr size_t size = 1E7;

int main(){

    auto start = std::chrono::high_resolution_clock::now();
    std::array<uint16_t, size> random_data;
    // Here I want to fill random_data with random numbers to avoid 
    // filling memory twice
    auto end = std::chrono::high_resolution_clock::now();
    std::chrono::duration<double> elapsed = end-start;
    std::cout << "Elapsed sec " << elapsed.count() << std::endl;

}

It is compiled with GCC using -std=gnu++11 on GNU/Linux.

asked Dec 25 '22 by jb.


1 Answer

Your std::array lives on the stack, and so does its data member, the underlying raw array. But the stack on your machine is presumably not twenty megabytes large (10^7 elements × 2 bytes each; the default thread stack on Linux is typically 8 MB), so the program crashes as soon as main's stack frame is set up, before the first statement even runs. That is why GDB reports the fault on the `auto start` line.
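To confirm the limit on your own machine, you can query it directly (a quick check, assuming a typical Linux shell; the numbers in the comments are the common defaults, not guaranteed values):

```shell
# Print the soft stack-size limit in kilobytes.
# On most Linux distributions this defaults to 8192 (8 MB).
ulimit -s

# The array needs roughly 20000 kB, so as a workaround you could
# raise the limit for the current shell session instead of changing
# the code:
#   ulimit -s 32768
```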

Use std::vector instead, which stores its elements on the heap:

std::vector<std::uint16_t> random_data(size);

Or, if you want to avoid the vector's value-initialization of every element and don't need the size to change dynamically, use a std::unique_ptr owning an array (new std::uint16_t[size] default-initializes, i.e. leaves the elements uninitialized):

std::unique_ptr<std::uint16_t[]> random_data( new std::uint16_t[size] );


answered Jan 04 '23 by Columbo