I noticed the following behavior. Say I create this multi-dimensional array:
spam = array(runif(96*48*60*360), dim = c(96,48,60,360))
It is quite predictable how much memory R should use for this: runif() returns doubles, so (96*48*60*360) * 8 bytes ≈ 759.4 MB. This is nicely confirmed using the lsos function (see this post):
> lsos()
         Type      Size PrettySize Rows Columns
spam    array 796262520   759.4 Mb   96      48
lsos function       776  776 bytes   NA      NA
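As a quick cross-check (using base R's object.size() on the same spam object; 8 bytes per double plus a small header), the arithmetic and the reported size agree:

> 96 * 48 * 60 * 360 * 8
[1] 796262400
> print(object.size(spam), units = "Mb")
759.4 Mb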
The R process, however, uses much more memory, roughly twice that size:
$ top | grep rsession
82:17628 hiemstra 20 0 1614m 1.5g 8996 S 0.3 40.4 0:04.85 rsession
Why does R do this? I assume the extra memory is reserved so that R can access it more quickly? Any thoughts?
Because the garbage collector has not run yet.
There is a lot of garbage, probably generated while the big array was being created, that still has to be cleared.
If you force a garbage collection by calling the gc() function, you will see that the memory actually used is pretty close to the size of your array:
> memory.size()
[1] 775.96
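For completeness, gc() itself prints R's own memory accounting when it runs, so you can confirm the effect even without memory.size() (which is only meaningful on Windows). A minimal sketch; the exact figures will vary per session:

> gc()   # force a collection; prints a table of Ncells/Vcells usage

After the collection, the "used" column of the Vcells row should be close to the ~759 MB of the array, while the "max used" column still reflects the ~1.5 GB peak that top reported before the garbage was cleared.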