Limiting memory usage in R under Linux

We are running R in a Linux cluster environment. The head node has hung a few times when a user inadvertently consumed all of its memory with an R process. Is there a way to limit R memory usage under Linux? I'd rather not suggest global ulimits, but that may be the only way forward.

asked Sep 25 '12 by seandavi

People also ask

How do I limit memory usage in R?

Use memory.limit(). You can increase the default with memory.limit(size = 2500), where size is in MB.
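For reference, a minimal sketch of that call; note that memory.limit() only ever worked on Windows builds of R (and is deprecated in recent versions), so it does not help in the Linux cluster setting asked about here:

memory.limit()              # report the current limit, in MB (Windows only)
memory.limit(size = 2500)   # raise the limit to 2500 MB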

Does R have a memory limit?

Under most 64-bit versions of Windows the limit for a 32-bit build of R is 4 GB; for the oldest ones it is 2 GB. The limit for a 64-bit build of R (imposed by the OS) is 8 TB.


1 Answer

There's unix::rlimit_as(), which allows setting memory limits for a running R process using the same mechanism the shell's ulimit uses. Windows and macOS are not supported.

In my .Rprofile I have

unix::rlimit_as(1.2e10, 1.2e10)

to limit memory usage to ~12 GB (the limit is given in bytes).
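Assuming the unix package's documented interface, calling rlimit_as() with no arguments reports the current soft and hard limits, which makes it easy to check that the .Rprofile line took effect:

unix::rlimit_as()
#> $cur
#> [1] 1.2e+10
#> $max
#> [1] 1.2e+10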

Before that...

I had created a small R package, ulimit, with similar functionality.

Install it from GitHub using

devtools::install_github("krlmlr/ulimit")

To limit the memory available to R to 2000 MiB, call:

ulimit::memory_limit(2000)

Now:

> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb
answered Sep 20 '22 by krlmlr