I want to write a function that imports data in a varying number of batches depending on how much RAM is available on the user's system. But how can I find the amount of available RAM from within R? I can use memory.size(), but that only works on Windows.
size/limit: If 32-bit R is run on most 64-bit versions of Windows, the maximum amount of obtainable memory is just under 4 GB. For 64-bit R under 64-bit Windows, the limit is currently 8 TB.
Go about your work as normal, and if the computer begins to slow down, press Ctrl+Shift+Esc to bring up the Windows Task Manager. Click the Performance tab and select Memory in the sidebar to see a graph of your current RAM usage.
R is designed as an in-memory application: all of the data you work with must be held in the RAM of the machine you're running R on. This optimizes performance and flexibility, but it does place constraints on the size of the data you're working with, since it must all fit in RAM.
RStudio itself absorbs a very large portion of my laptop's RAM, even for very simple tasks. Simply having the application open consumes about 1 GB of RAM. If I perform simple tasks (e.g. reading a 500 MB CSV file and making some plots), it easily consumes around 2 GB of RAM.
Given the warnings concerning platform dependency discussed in the earlier comment, you could, for example, parse /proc/meminfo on Linux:
$ grep MemFree /proc/meminfo
MemFree:          573660 kB
$ awk '/MemFree/ {print $2}' /proc/meminfo
565464
You could try the second approach via system(..., intern=TRUE), or even via a pipe connection.
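A minimal sketch of the pipe-connection variant (Linux only, since it assumes /proc/meminfo exists):

```r
# Read the MemFree value (in kB) from /proc/meminfo via a pipe connection.
con <- pipe("awk '/MemFree/ {print $2}' /proc/meminfo")
memfree_kb <- as.numeric(readLines(con))
close(con)
memfree_kb
```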
Edit, some 5+ years later: in R, and just following what the previous paragraph hinted at:
R> memfree <- as.numeric(system("awk '/MemFree/ {print $2}' /proc/meminfo",
+                               intern=TRUE))
R> memfree
[1] 3342480
R>
Now you can do that with the benchmarkme::get_ram function.
https://cran.r-project.org/web/packages/benchmarkme/benchmarkme.pdf
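For instance (note that get_ram() reports the total installed RAM in bytes, not the RAM currently free):

```r
# install.packages("benchmarkme")  # if not already installed
library(benchmarkme)
ram <- get_ram()  # total physical RAM, in bytes
print(ram)
```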
I would recommend using memuse::Sys.meminfo().
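Sys.meminfo() reports both total and free RAM, which you could use to pick a batch count for the original question. A sketch under stated assumptions: the as.numeric() coercion of memuse's size objects, the file name, and the 0.5 safety factor are all my own choices, not part of the package's documented workflow.

```r
# install.packages("memuse")  # if not already installed
library(memuse)
info <- Sys.meminfo()                 # list with totalram and freeram
free_bytes <- as.numeric(info$freeram)  # assumption: coerce to bytes

# Hypothetical batching rule: let each batch use at most half of free RAM,
# using the on-disk file size as a rough proxy for in-memory size.
file_size <- file.size("big.csv")
n_batches <- max(1, ceiling(file_size / (0.5 * free_bytes)))
```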