 

Is there a limitation on simultaneous filestreams?

Tags: c++, fstream

I have a strategic question about the use of simultaneously opened fstreams. I have to write a program which has to read a large number of files. Each file contains information for a bunch of identifiers, each appearing only once per file. I have to process this information and then save it, for each identifier, in a separate file. Every identifier appears in several files and should always be saved to the same file (many entries per identifier). I expect some hundred identifiers, so I am unsure whether I should have several hundred filestreams open simultaneously.

So is there a limit on simultaneously open filestreams? Or would you suggest another way of doing this?

The program will process a massive amount of data (about 10 GB or more) and will perhaps run for several hours.

Thanks

Asked Jul 04 '13 14:07 by FThewes

1 Answer

There's ultimately a limit to anything. Files are a perfect example of something managed by the operating system, and you will have to consult your OS documentation for the specific limit. In Linux, I believe it is configurable in the kernel. There may additionally be user and process quotas.
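For what it's worth, on a POSIX system (Linux, OS X) you can also ask the process itself what its limit is. A minimal sketch, assuming the POSIX getrlimit API is available; this is not part of the tested program below:

#include <iostream>
#include <cstdio>
#include <sys/resource.h>

int main() {
    // RLIMIT_NOFILE is the per-process cap on open file descriptors.
    rlimit lim {};
    if ( getrlimit( RLIMIT_NOFILE, &lim ) == 0 )
        std::cout << "soft limit: " << lim.rlim_cur
                  << ", hard limit: " << lim.rlim_max << '\n';
    else
        std::perror( "getrlimit" );
}

From a shell, ulimit -n reports the same soft limit.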

I don't think 200 is too many to ask.

It's quite simple to try and see. Just write a program that keeps opening more files until you get an error.

Live example.

On Mac OS X 10.8, this program

#include <iostream>
#include <fstream>
#include <iomanip>
#include <string>

int main() {
    int i = 0;
    std::ofstream *f;
    do {
        // Each stream is deliberately never deleted, so it stays open and
        // counts toward the per-process limit.
        f = new std::ofstream( std::to_string( i ++ ) );
    } while ( * f << "hello" << std::flush );
    -- i; // Don't count last iteration, which failed to open anything.

    std::cout << i << '\n';
}

produces the output 253. So if you're on a Mac, you're golden :).
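If you'd rather not depend on the limit at all, one alternative is to buffer results per identifier in memory and periodically append them to that identifier's file, so only one output stream is open at any moment. A minimal sketch only; the flush_buffers name, the identifiers and the ".txt" suffix are made up for illustration:

#include <fstream>
#include <map>
#include <string>
#include <vector>

// Append each identifier's buffered lines to its own file, then clear the buffer.
void flush_buffers( std::map< std::string, std::vector< std::string > > & buffers ) {
    for ( auto & entry : buffers ) {
        // std::ios::app keeps adding to the same file across repeated flushes.
        std::ofstream out( entry.first + ".txt", std::ios::app );
        for ( auto const & line : entry.second )
            out << line << '\n';
        entry.second.clear();
    }
}

int main() {
    std::map< std::string, std::vector< std::string > > buffers;
    buffers[ "id42" ].push_back( "computed value" );  // hypothetical identifier and data
    buffers[ "id7" ].push_back( "another value" );
    flush_buffers( buffers );  // call periodically to bound memory use
}

Whether that's worth the extra bookkeeping depends on how often each identifier shows up; with only a few hundred output files, the straightforward approach above should be fine.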

Answered Oct 27 '22 01:10 by Potatoswatter