Check your current limit in your R session with memory.limit(), then increase it as appropriate with memory.limit(size). For example, if your current limit is 8000, triple it to 24000 with memory.limit(size=24000).
You can use the function memory.limit(size=...) to increase the amount of memory allocated to R, and that should fix the problem.
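A minimal sketch of that workflow (note that memory.limit() is Windows-only, and on recent versions of R, roughly 4.2 onwards, it is a stub that no longer changes anything):

memory.limit()              # current limit in Mb, e.g. 8000
memory.limit(size = 24000)  # request a higher limit (in Mb)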
The “cannot allocate vector of size” error indicates that R has run out of memory while working with a large amount of data. It does not necessarily mean there is a coding mistake in your R script, but in some situations a coding change is the solution.
The minimum is currently 32Mb. If 32-bit R is run on most 64-bit versions of Windows, the maximum amount of obtainable memory is just under 4Gb. For 64-bit versions of R under 64-bit Windows, the limit is currently 8Tb.
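If you are not sure which build you are running, you can check from within R:

.Machine$sizeof.pointer   # 8 on a 64-bit build of R, 4 on a 32-bit build
R.version$arch            # e.g. "x86_64"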
Consider whether you really need all this data explicitly, or can the matrix be sparse? There is good support in R (see the Matrix package, for example) for sparse matrices.
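For instance, a mostly-empty matrix stored in sparse form takes a tiny fraction of the dense footprint (a rough sketch; the dimensions and values here are arbitrary):

library(Matrix)
# A 10,000 x 10,000 dense double matrix needs roughly 800 MB;
# the sparse version stores only the non-zero entries.
m <- sparseMatrix(i = c(1, 5000, 10000),
                  j = c(1, 5000, 10000),
                  x = c(1.5, 2.5, 3.5),
                  dims = c(10000, 10000))
object.size(m)   # a few kilobytes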
Keep all other processes and objects in R to a minimum when you need to make objects of this size. Use gc() to clear memory that is no longer used or, better, create only the object you need in a single session.
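For example (big_df is a hypothetical object name):

rm(big_df)   # drop large objects you no longer need
gc()         # run garbage collection and report memory still in use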
If none of the above helps, get a 64-bit machine with as much RAM as you can afford and install 64-bit R.
If you cannot do that, there are many online services for remote computing.
If you cannot do that, memory-mapping tools like the ff package (or bigmemory, as Sascha mentions) will help you build a new solution. In my limited experience ff is the more advanced package, but you should read the High Performance Computing topic on CRAN Task Views.
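A rough sketch of the memory-mapping idea with ff, assuming the package is installed (the dimensions are arbitrary; the data live in a file on disk and only the chunks you touch are loaded into RAM):

library(ff)
# File-backed double matrix of 1e6 x 100 entries (~800 MB on disk)
x <- ff(vmode = "double", dim = c(1e6, 100))
x[1:5, 1] <- rnorm(5)   # chunks are read and written on demand
x[1:5, 1]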
For Windows users, the following helped me a lot to understand some memory limitations:
Calling gc() to do garbage collection works: I can see the memory use go down to 2 GB.
Additional advice that works on my machine:
I followed the help page of memory.limit and found out that on my computer R can use up to ~1.5 GB of RAM by default, and that the user can increase this limit. Using the following code,
> memory.limit()
[1] 1535.875
> memory.limit(size=1800)
helped me to solve my problem.
Here is a presentation on this topic that you might find interesting:
http://www.bytemining.com/2010/08/taking-r-to-the-limit-part-ii-large-datasets-in-r/
I haven't tried the things discussed there myself, but the bigmemory package seems very useful.
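A minimal sketch with bigmemory (kept in RAM here; adding backingfile/descriptorfile arguments makes it file-backed instead):

library(bigmemory)
# A big.matrix is stored outside R's normal heap, so large objects
# are not duplicated by R's copy-on-modify semantics.
x <- big.matrix(nrow = 1e6, ncol = 10, type = "double", init = 0)
x[1:3, 1] <- c(1, 2, 3)
x[1:3, 1]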