We are running R on a Linux cluster. The head node has hung a few times when a user inadvertently consumed all of its memory with an R process. Is there a way to limit R's memory usage under Linux? I'd rather not suggest global ulimits, but that may be the only way forward.
Use memory.limit() (this applies only to Windows builds of R). You can increase the default with memory.limit(size=2500), where the size is in MB.
Under most 64-bit versions of Windows the limit for a 32-bit build of R is 4Gb; for the oldest ones it is 2Gb. The limit for a 64-bit build of R (imposed by the OS) is 8Tb.
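For reference, a minimal Windows-only sketch (note that from R 4.2 onward these functions are defunct stubs and no longer change the limit):
memory.limit()             # report the current limit in MB
memory.limit(size = 2500)  # request a 2500 MB limit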
There's unix::rlimit_as(), which allows setting memory limits for a running R process using the same mechanism that ulimit uses in the shell. Windows and macOS are not supported.
In my .Rprofile
I have
unix::rlimit_as(1.2e10, 1.2e10)
to limit memory usage to ~12 GB (the values are in bytes).
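As a rough sketch of the same mechanism from an interactive session (Linux only; the 4e9-byte cap is just an example value):
unix::rlimit_as(cur = 4e9)   # soft-limit the address space to ~4 GB
unix::rlimit_as()            # report the current soft and hard limits
integer(2e9)                 # an ~8 GB allocation now fails with an error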
I had created a small R package, ulimit, with similar functionality.
Install it from GitHub using
devtools::install_github("krlmlr/ulimit")
To limit the memory available to R to 2000 MiB, call:
ulimit::memory_limit(2000)
Now:
> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb
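For the cluster head node in the question, one possible approach (assuming the unix package is installed system-wide; the 8e9-byte cap is an arbitrary example) is to set a default cap for every session in the site-wide startup file, e.g. $R_HOME/etc/Rprofile.site, rather than via global ulimits:
# Rprofile.site sketch: cap each R session's address space at ~8 GB.
if (.Platform$OS.type == "unix" && requireNamespace("unix", quietly = TRUE)) {
  unix::rlimit_as(cur = 8e9)
}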