
Efficient memory management in R

I have 6 GB of memory on my machine (Windows 7 Pro, 64-bit), and in R I get

> memory.limit()
6141

Of course, when dealing with big data, memory allocation errors occur. So, to make R use virtual memory, I use

> memory.limit(50000)

Now, when running my script, I no longer get memory allocation errors, but R hogs all the memory on the machine, so I can't use the computer until the script finishes. I wonder if there is a better way to make R manage the machine's memory. One thing it could do is fall back to virtual memory once it uses more physical memory than the user specifies. Is there an option like that?

asked Apr 05 '13 by Tae-Sung Shin

2 Answers

Look at the ff and bigmemory packages. They use functions that know about R objects to keep those objects on disk, rather than leaving it to the OS, which only sees chunks of memory and not what they represent.
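
As a rough illustration of the file-backed approach (a sketch, not taken from the answer; the file names are just placeholders), bigmemory can keep a large matrix on disk and memory-map it, so only the parts you touch are pulled into RAM:

 library(bigmemory)

 # Create a file-backed matrix; the data lives in big_x.bin on disk,
 # not in R's heap.
 x <- filebacked.big.matrix(nrow = 1e6, ncol = 10, type = "double",
                            backingfile = "big_x.bin",
                            descriptorfile = "big_x.desc")

 x[1:5, 1] <- rnorm(5)   # reads and writes go through the memory-mapped file

 # In a later session, re-attach the same data without loading it all into RAM:
 x <- attach.big.matrix("big_x.desc")

The ff package works along similar lines, e.g. read.csv.ffdf() reads a large CSV into an on-disk data frame.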

answered Oct 28 '22 by Greg Snow


R doesn't manage the memory of the machine. That is the responsibility of the operating system. The only reason memory.size and memory.limit exist on Windows is that (from help("Memory-limits")):

 Under Windows, R imposes limits on the total memory allocation
 available to a single session as the OS provides no way to do so:
 see 'memory.size' and 'memory.limit'.

R objects also have to occupy contiguous space in RAM, so you can run into allocation failures with only a few large objects. You could probably be more careful about the number and size of the objects you create, and avoid holding so much in memory at once.
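
For example, one way to keep an eye on this with base R alone (a minimal sketch; the matrix here is just a stand-in for whatever large objects your script creates) is to check object sizes, drop objects you no longer need, and run the garbage collector:

 x <- matrix(rnorm(1e6), ncol = 100)      # stand-in for a large intermediate result
 print(object.size(x), units = "MB")      # how much RAM this object occupies

 rm(x)   # drop the reference once you are done with it
 gc()    # run garbage collection so the memory can actually be reclaimed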

answered Oct 28 '22 by Joshua Ulrich