I am interested in exploring how R can handle data out-of-memory. I've found the bigmemory package and friends (bigtabulate and biganalytics), but was hoping that someone could point me to a worked-out example that uses file backing with these packages. Any other out-of-memory tips would also be appreciated.
Charlie, just email Mike and Jay; they have a number of examples built around the ASA 'flights' database example from a year or two ago.
Edit: In fact, the Documentation tab has what I had in mind; the scripts are also on the site.
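To illustrate what file backing with bigmemory looks like, here is a minimal sketch (filenames and dimensions are made up; the flights data itself would be read in with `read.big.matrix`):

```r
library(bigmemory)
library(biganalytics)

# Create a file-backed big.matrix: the data live on disk in flights.bin,
# and only the pages being touched are mapped into RAM.
x <- filebacked.big.matrix(nrow = 1e6, ncol = 3, type = "double",
                           backingfile = "flights.bin",
                           descriptorfile = "flights.desc")
x[, 1] <- rnorm(1e6)

# In a later session, re-attach via the descriptor file without
# re-reading or copying the data.
y <- attach.big.matrix("flights.desc")
colmean(y)  # biganalytics operates on the big.matrix directly
```

The key point is the descriptor file: it lets multiple R sessions (or parallel workers) share the same on-disk matrix.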
Take a look at "CRAN Task View: High-Performance and Parallel Computing with R". There is a section "Large memory and out-of-memory data" where several solutions are mentioned, for example the ff package.
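For comparison, a minimal ff sketch (the filename and length are placeholders): ff stores a vector in a flat file and pages chunks in and out of RAM as they are accessed.

```r
library(ff)

# A double vector of 10 million elements, stored on disk in big.ff
x <- ff(vmode = "double", length = 1e7, filename = "big.ff")

# Process in bounded-memory chunks rather than materializing x at once
for (i in chunk(x)) {
  x[i] <- runif(sum(i))
}
```

The `chunk()` helper yields index ranges sized to the available RAM, which is the usual idiom for out-of-memory loops with ff.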