
Regression in R -- 4 features, 4 million instances

I have a text file in the form (UserId, MovieId, Ratings, Time) and I want to run a vanilla regression on the dataset (just 4 features, >4 million instances):

model <- glm(UserId ~ MovieId + Ratings + Time, data = <name>)

It gave an error:

ERROR: cannot allocate 138.5 MB vector.

The size of the file is just 93 MB. How do I do regression in R without running into memory problems? Should I store the data differently?

Thanks.

Some more info: I am working on a Linux box with 3 GB of RAM. I have googled around, but most of the links I found talk about datasets that are larger than RAM, which is not the case here (just 93 MB).

asked Jul 13 '11 by crazyaboutliv

2 Answers

biglm is a package specifically designed for fitting regression models to large data sets.

It works by processing the data block-by-block. The amount of memory it requires is a function of the number of variables, but is not a function of the number of observations.
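For example, here is a minimal sketch of the chunk-by-chunk workflow (the file name, whitespace-delimited layout, numeric IDs, and chunk size below are assumptions based on the question):

library(biglm)

cols <- c("UserId", "MovieId", "Ratings", "Time")
con  <- file("ratings.txt", open = "r")   # hypothetical file name

# Fit on the first chunk, then feed the rest of the file in blocks
chunk <- read.table(con, nrows = 1e6, col.names = cols)
fit   <- biglm(UserId ~ MovieId + Ratings + Time, data = chunk)

repeat {
  chunk <- tryCatch(read.table(con, nrows = 1e6, col.names = cols),
                    error = function(e) NULL)  # read.table errors at end of file
  if (is.null(chunk) || nrow(chunk) == 0) break
  fit <- update(fit, chunk)                    # update() on a biglm fit adds the new block
}
close(con)

summary(fit)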

answered by NPE

The model matrix glm needs has the same number of rows as your data, but roughly as many columns as there are unique strings (factor levels)!

So if you have 1000 movies, that will generate roughly a 4e6 x 1000 matrix of doubles (8 bytes each). That's around 32 GB...

You can try to generate the model matrix separately like this:

# Sample of 100 rows, 10 users, 20 movies
d <- data.frame(UserId = rep(paste('U', 1:10), each = 10),
                MovieId = sample(paste('M', 1:20), 100, replace = TRUE),
                Ratings = runif(100), Time = runif(100, 45, 180))
dim(d) # 100 x 4
m <- model.matrix(~ MovieId + Ratings + Time, data = d)
dim(m) # 100 x 22, if all 20 movies are sampled (intercept + 19 movie dummies + Ratings + Time)
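
If the movie dummy columns are what blow up memory, a sparse model matrix is another option; this sketch uses the Matrix package (not part of the original answer, so treat it as a suggestion) on the same toy data d:

library(Matrix)
# Stores only the non-zero entries, so the mostly-zero movie dummy columns
# cost far less than a dense 4e6 x 1000 matrix of doubles.
ms <- sparse.model.matrix(~ MovieId + Ratings + Time, data = d)
dim(ms)          # same dimensions as the dense model matrix
object.size(ms)  # but a much smaller footprint on real-sized data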
answered by Tommy