
Long Vector Not Supported Yet Error in R (Windows 64-bit Version)

I'm trying to test the memory limitations of the current R version.

runtest <- function(size) {
  x <- "testme"
  while (TRUE) {
    x <- c(x, x)             # keep doubling the character vector
    size <<- object.size(x)  # record the size in the global env; the last value survives the failure
  }
}

By running runtest(size) in the console on my laptop, I get the following error:

> runtest(size)
Error: cannot allocate vector of size 4.0 Gb
In addition: Warning messages:
1: In structure(.Call(C_objectSize, x), class = "object_size") :
  Reached total allocation of 7915Mb: see help(memory.size)
2: In structure(.Call(C_objectSize, x), class = "object_size") :
  Reached total allocation of 7915Mb: see help(memory.size)
3: In structure(.Call(C_objectSize, x), class = "object_size") :
  Reached total allocation of 7915Mb: see help(memory.size)
4: In structure(.Call(C_objectSize, x), class = "object_size") :
  Reached total allocation of 7915Mb: see help(memory.size)
> size
2147483736 bytes
> 

This size looks very close to the 2^31 - 1 limit that people have mentioned before. So I then ran the same code on our upgraded desktop, which has 128GB of RAM, after setting the memory variable in the shortcut for the 64-bit version to a maximum of 100GB. This is the new error I get:

Error in structure(.Call(C_objectSize, x), class = "object_size") :
  long vectors not supported yet: unique.c: 1720
> size
8589934680 bytes
>

Does this 8.5GB limit have anything to do with running under Windows (specifically Windows 7 Enterprise)? I think the R help page on memory limits (http://stat.ethz.ch/R-manual/R-devel/library/base/html/Memory-limits.html) explains this, but I'm having trouble understanding what it says (not my area of expertise).
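
For reference, here is a minimal sketch of the Windows-only helpers that the warnings point to (assuming a Windows build of R where memory.size and memory.limit are still available):

memory.size(max = TRUE)      # most memory R has obtained from Windows so far, in MB
memory.limit()               # current allocation ceiling, in MB
memory.limit(size = 102400)  # request roughly 100GB, like the shortcut setting above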

asked Dec 20 '22 by lolatu2

1 Answer

Looking at the source of size.c and unique.c, it appears that the hashing used to improve object.size doesn't support long vectors yet:

/* Use hashing to improve object.size. Here we want equal CHARSXPs,
   not equal contents. */

and

/*  Currently the hash table is implemented as a (signed) integer
    array.  So there are two 31-bit restrictions, the length of the
    array and the values.  The values are initially NIL (-1).  0-based
    indices are inserted by isDuplicated, and invalidated by setting
    to NA_INTEGER.
*/

Therefore, it is object.size that is choking. How about calling numeric(2^33) to see if you can create such a large object? That should be 64GB (2^33 doubles at 8 bytes each).
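
A minimal check, assuming the 100GB limit is in effect and that much RAM is actually free:

x <- numeric(2^33)                  # 2^33 doubles * 8 bytes each = 64GB
length(x)                           # 8589934592, well past the 2^31 - 1 element limit
length(x) > .Machine$integer.max    # TRUE means R really did build a long vector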

answered Dec 22 '22 by James