I'm doing an iterative computation on a flow network during which I need to record how much each source contributes to the flow on each edge. The flow on any one edge is due to 2% of the sources on average, so I define vector< map<int, double> > flow, where flow[e][s] = f means that the flow on edge e due to source s is f. At each iteration, every f in flow gets updated.
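For concreteness, the structure and update pattern look like this (scale_all is a hypothetical stand-in for whatever the real per-iteration update does):

```cpp
#include <map>
#include <vector>

// flow[e][s] = f: the flow on edge e due to source s is f.
// Each inner map is sparse: on average only 2% of sources appear per edge.
using Flow = std::vector<std::map<int, double>>;

// Hypothetical stand-in for the per-iteration update, which touches every f.
void scale_all(Flow& flow, double k) {
    for (auto& per_edge : flow)
        for (auto& [s, f] : per_edge)
            f *= k;
}
```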
The program's peak memory usage gets close to 4 GB. This works on (32-bit) Linux and OS X, but it crashes on Windows, which seems to impose a 2 GB per-process limit.
How can I implement a disk-based data structure with a vector< map<int, double> > interface (or otherwise get around this problem)?
I have used STXXL for similar scenarios. It might be worth looking into.
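STXXL's external-memory containers (stxxl::vector, stxxl::map) are the ready-made route. To show the shape of a hand-rolled alternative, here is a minimal sketch under two assumptions of mine: the sparsity pattern (which sources touch which edge) is fixed across iterations, and only the double values need to live on disk. The class name DiskFlow and the layout are hypothetical and untuned:

```cpp
#include <algorithm>
#include <cstdio>
#include <map>
#include <stdexcept>
#include <vector>

// Sketch: keep only the sparsity pattern in RAM (one int per entry plus
// per-edge offsets) and store the double values in a binary file.
class DiskFlow {
public:
    // pattern[e] lists the source ids contributing to edge e.
    DiskFlow(const char* path, const std::vector<std::vector<int>>& pattern)
        : file_(std::fopen(path, "wb+")) {
        if (!file_) throw std::runtime_error("cannot open backing file");
        offsets_.push_back(0);
        for (const auto& srcs : pattern) {
            sources_.insert(sources_.end(), srcs.begin(), srcs.end());
            std::sort(sources_.end() - srcs.size(), sources_.end());
            offsets_.push_back(sources_.size());
        }
        // One double per (edge, source) entry, initialised to zero.
        std::vector<double> zeros(sources_.size(), 0.0);
        std::fwrite(zeros.data(), sizeof(double), zeros.size(), file_);
    }
    ~DiskFlow() { if (file_) std::fclose(file_); }

    double get(int e, int s) {
        seek(e, s);
        double v = 0.0;
        std::fread(&v, sizeof v, 1, file_);
        return v;
    }
    void set(int e, int s, double v) {
        seek(e, s);
        std::fwrite(&v, sizeof v, 1, file_);
        std::fflush(file_);
    }

    // Materialise one edge's map, mimicking flow[e] from the question.
    std::map<int, double> edge(int e) {
        std::map<int, double> m;
        for (std::size_t i = offsets_[e]; i < offsets_[e + 1]; ++i)
            m[sources_[i]] = at(i);
        return m;
    }

private:
    double at(std::size_t i) {
        // Note: use _fseeki64/fseeko instead for files beyond 2 GB.
        std::fseek(file_, static_cast<long>(i * sizeof(double)), SEEK_SET);
        double v = 0.0;
        std::fread(&v, sizeof v, 1, file_);
        return v;
    }
    void seek(int e, int s) {
        auto lo = sources_.begin() + offsets_[e];
        auto hi = sources_.begin() + offsets_[e + 1];
        auto it = std::lower_bound(lo, hi, s);
        if (it == hi || *it != s)
            throw std::out_of_range("no such source on this edge");
        std::fseek(file_,
                   static_cast<long>((it - sources_.begin()) * sizeof(double)),
                   SEEK_SET);
    }
    std::FILE* file_;
    std::vector<std::size_t> offsets_;  // per-edge start index into sources_
    std::vector<int> sources_;          // source ids, flattened, sorted per edge
};
```

With this split, RAM holds only the integer index while the roughly 3 GB of doubles sit in the file; processing one edge at a time via edge() keeps the file access mostly sequential. An external-memory library like STXXL adds the block caching and asynchronous I/O that this sketch deliberately leaves out.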