
Memory Allocation "Error: cannot allocate vector of size 75.1 Mb" [duplicate]

In the course of vectorizing some simulation code, I've run into a memory issue. I'm using 32-bit R version 2.15.0 (via RStudio 0.96.122) under Windows XP. My machine has 3.46 GB of RAM.

> sessionInfo()
R version 2.15.0 (2012-03-30)
Platform: i386-pc-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United Kingdom.1252  LC_CTYPE=English_United Kingdom.1252   
[3] LC_MONETARY=English_United Kingdom.1252 LC_NUMERIC=C                           
[5] LC_TIME=English_United Kingdom.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] Matrix_1.0-6   lattice_0.20-6 MASS_7.3-18   

loaded via a namespace (and not attached):
[1] grid_2.15.0  tools_2.15.0

Here is a minimal example of the problem:

> memory.limit(3000)
[1] 3000
> rm(list = ls())
> gc()
          used (Mb) gc trigger  (Mb)  max used   (Mb)
Ncells 1069761 28.6    1710298  45.7   1710298   45.7
Vcells  901466  6.9   21692001 165.5 173386187 1322.9
> N <- 894993
> library(MASS)
> sims <- mvrnorm(n = N, mu = rep(0, 11), Sigma = diag(nrow = 11))
> sims <- mvrnorm(n = N + 1, mu = rep(0, 11), Sigma = diag(nrow = 11))
Error: cannot allocate vector of size 75.1 Mb

(In my application the covariance matrix Sigma is not diagonal, but I get the same error either way.)

I've spent the afternoon reading about memory allocation issues in R (including here, here and here). From what I've read, I get the impression that it's not a matter of the available RAM per se, but of the available contiguous address space. Still, 75.1 Mb seems pretty small to me.

I'd greatly appreciate any thoughts or suggestions that you might have.

asked Jun 06 '12 by inhuretnakht

3 Answers

I had the same error using the raster package.

> my_mask[my_mask[] != 1] <- NA
Error: cannot allocate vector of size 5.4 Gb

The solution is really simple: increase R's memory limit. Here are the relevant lines:

## To check the current memory limit
> memory.limit()
[1] 8103
## To increase it
> memory.limit(size=56000)
[1] 56000
## This raises the limit to 56000 MB (about 56 GB)
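Note that memory.limit() and memory.size() are Windows-only functions, and under 32-bit R the limit cannot be raised much beyond 4 GB, so a setting of 56000 MB only takes effect on 64-bit R. To inspect the current situation before raising the limit:

> memory.size()            ## MB currently in use by R
> memory.size(max = TRUE)  ## maximum MB obtained from the OS this session
> memory.limit()           ## current limit, in MB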

Hopefully this will help you solve the problem. Cheers!

answered by juandelsur

R has gotten to the point where the OS cannot allocate it another 75.1 Mb chunk of RAM. That is the size of the memory chunk required to do the next sub-operation; it is not a statement about the amount of contiguous RAM required to complete the entire process. By this point all your available RAM is exhausted, R needs more memory to continue, and the OS is unable to make any more available to it.
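As a sanity check, that 75.1 Mb is simply the size of the result matrix mvrnorm has to allocate: an (N + 1) x 11 matrix of 8-byte doubles (R reports sizes in MiB):

> N <- 894993
> (N + 1) * 11 * 8 / 1024^2  ## bytes -> Mb
[1] 75.10936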

Potential solutions to this are manifold. The obvious one is to get hold of a 64-bit machine with more RAM. I forget the details, but IIRC on 32-bit Windows any single process can only use a limited amount of RAM (2GB?), and regardless Windows retains a chunk of memory for itself, so the RAM available to R will be somewhat less than the 3.46 GB you have. On 64-bit Windows, R will be able to use more RAM, and the maximum amount of RAM you can fit/install will be increased.
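You can confirm which build you are running from within R itself:

> .Machine$sizeof.pointer  ## 4 on a 32-bit build, 8 on a 64-bit build
> R.version$arch           ## e.g. "i386" (32-bit) or "x86_64" (64-bit)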

If that is not possible, then consider an alternative approach: do your simulations in batches, with the n per batch much smaller than N. That way you draw a much smaller number of simulations at a time, do whatever you wanted with them, collect the results, and repeat until you have done sufficient simulations. Your N is large (894,993 in the example), so run smaller batches a number of times to reach N draws over-all, as in the sketch below.
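Here is a minimal sketch of that batching idea (batch_size and the per-batch reduction, column sums here, are placeholders; substitute whatever your application actually needs per batch):

library(MASS)

N <- 894993           # total number of draws required
batch_size <- 100000  # hypothetical batch size; tune to your RAM
mu <- rep(0, 11)
Sigma <- diag(nrow = 11)

col_totals <- numeric(11)  # running per-column totals across batches
n_done <- 0
while (n_done < N) {
  n_batch <- min(batch_size, N - n_done)
  sims <- mvrnorm(n = n_batch, mu = mu, Sigma = Sigma)
  col_totals <- col_totals + colSums(sims)  # replace with your own per-batch work
  n_done <- n_done + n_batch
  rm(sims); gc()  # free the batch before drawing the next one
}
col_means <- col_totals / N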

answered by Gavin Simpson

gc() can help.

Saving your data as .RData, closing R, re-opening it, and loading the .RData back in can also help.

See my answer here for more details: https://stackoverflow.com/a/24754706/190791
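A minimal sketch of that save-and-restart workflow (the object name sims is hypothetical; adapt it to your workspace):

> save(sims, file = "sims.RData")  ## or save.image() for the whole workspace
## ...quit R and start a fresh session (fresh, unfragmented address space)...
> load("sims.RData")               ## restore the saved objects
> gc()                             ## report (and trigger) a garbage collection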

answered by Timothée HENRY