Reserved memory of R is twice the size of an allocated array

I spotted the following behavior. Say I create the following multi-dimensional array:

spam = array(runif(96*48*60*360), dim = c(96,48,60,360))

It is quite predictable how much memory R should use for this: a numeric (double-precision) array needs (96*48*60*360) * 8 bytes ≈ 759.4 MB. This is nicely confirmed using the lsos function (see this post):

> lsos()
         Type      Size PrettySize Rows Columns
spam    array 796262520   759.4 Mb   96      48
lsos function       776  776 bytes   NA      NA
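The arithmetic behind that figure can be checked directly; a minimal sketch (the dimensions are taken from the question, and object.size() is shown on a tiny array to keep the example cheap):

```r
# Each element of a double-precision numeric vector occupies 8 bytes.
n <- 96 * 48 * 60 * 360          # 99,532,800 elements
expected_mb <- n * 8 / 1024^2    # bytes -> MiB
round(expected_mb, 1)            # 759.4, matching the lsos output

# object.size() reports payload plus a small fixed header,
# which is why lsos shows 796262520 rather than exactly n * 8:
small <- array(runif(10), dim = c(10))
object.size(small)               # 10 * 8 bytes plus header overhead
```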

R as a process however uses much more memory, roughly twice the size:

$ top | grep rsession
82:17628 hiemstra  20   0 1614m 1.5g 8996 S  0.3 40.4   0:04.85 rsession

Why does R do this? I assume the extra reserved memory is kept so that R can reuse it quickly? Any thoughts?

Paul Hiemstra asked Jul 26 '12 08:07

1 Answer

Because the garbage collector has not run yet.
There is a lot of garbage, probably generated during the creation of the big array, that has not been cleared yet.

If you force a garbage collection by calling the gc() function, you will see that the memory in use is pretty close to the size of your array:

> memory.size()
[1] 775.96
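A minimal way to see this for yourself (note that memory.size() is Windows-only; on other platforms the table returned by gc() serves the same purpose):

```r
# gc() forces a full garbage collection and returns a summary matrix.
# The "used" columns afterwards reflect live objects only, not memory
# the R process has merely reserved from the operating system.
gc_stats <- gc()
gc_stats
```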
digEmAll answered Nov 07 '22 00:11