
saveRDS inflating size of object

Tags: r, gzip, save

This is a tricky one as I can't provide a reproducible example, but I'm hoping that others may have had experience dealing with this.

Essentially I have a function that pulls a large quantity of data from a DB, cleans it and reduces its size, and loops through some parameters to produce a series of lm model objects, parameter values and other reference values. These are compiled into a complex list structure totalling about 10MB.

It's then supposed to be saved as an RDS file on AWS S3, from where it's retrieved in a production environment to build predictions.

e.g.

db.connection <- db.connection.object

build_model_list <- function(db.connection) {

    clean_and_build_models <- function(db.connection, other.parameters) {

        ## get_db_data, clean_data and lm_model are defined externally (in a package)
        db.data <- get_db_data(db.connection, some.parameters)   # retrieve db data
        clean.data <- clean_data(db.data, some.parameters)       # clean and filter data based on parameters
        lm.model <- lm_model(clean.data)                         # build lm model based on clean.data

        return(list(lm.model, other.parameters))
    }

    looped.model.object <- llply(some.parameters, clean_and_build_models)
    return(looped.model.object)
}

model.list <- build_model_list(db.connection)

saveRDS(model.list, "~/a_place/model_list.RDS")

The issue I'm getting is that the 'model.list' object, which is only 10MB in memory, inflates to many GB when I save it locally as RDS or try to upload it to AWS S3.

I should note that though the function processes very large quantities of data (~ 5 million rows), the data used in the outputs is no larger than a few hundred rows.
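For what it's worth, this mismatch between in-memory size and saved size can be reproduced in isolation. The sketch below (all object names invented) fits one model inside an environment holding a large vector and another at top level, then compares their serialized sizes:

```r
# Illustration: a formula silently captures its enclosing environment,
# and serialize()/saveRDS() will write that environment's contents out too.
junk_env_fit <- local({
  junk <- rnorm(1e6)                       # ~8 MB, captured by the formula's env
  d <- data.frame(x = 1:10, y = rnorm(10))
  lm(y ~ x, data = d)
})

clean_fit <- lm(y ~ x, data = data.frame(x = 1:10, y = rnorm(10)))

length(serialize(junk_env_fit, NULL))      # large: drags `junk` along
length(serialize(clean_fit, NULL))         # small: its env is the global env,
                                           # which is stored by reference
```

Note that `object.size(junk_env_fit)` stays small, because object.size() does not follow environments; that is exactly why a "10MB" object can balloon on disk.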

Reading the limited info on this on Stack Exchange, I've found that moving some of the externally defined functions (as part of a package) inside the main function (e.g. clean_data and lm_model) helps reduce the RDS save size.

This however has some big disadvantages.

Firstly, it's trial and error with no clear logical order; with frequent crashes and a couple of hours needed to build the list object, it makes for a very long debugging cycle.

Secondly, it means my main function will be many hundreds of lines long, which will make future alterations and debugging much trickier.

My question to you is:

Has anyone encountered this issue before?

Any hypotheses as to what's causing it?

Has anyone found a logical non-trial-and-error solution to this?

Thanks for your help.

asked Feb 14 '17 by IanCognito
2 Answers

It took a bit of digging but I did actually find a solution in the end.

It turns out it was the lm model objects that were the guilty party. Based on this very helpful article:

https://blogs.oracle.com/R/entry/is_the_size_of_your

It turns out that the lm.object$terms component includes an environment attribute that references the objects present in the global environment when the model was built. Under certain circumstances, saveRDS will pull those environment objects into the saved object.

As I had ~0.5GB sitting in the global environment and a list of ~200 lm model objects, this caused the RDS object to inflate dramatically: it was actually trying to compress ~100GB of data.

To test whether this is what's causing the problem, execute the following code:

as.matrix(lapply(lm.object, function(x) length(serialize(x,NULL)))) 

This reports the serialized size of each component of the model object, so you can see whether $terms is the one inflating.

The following code will remove the environmental references from the $terms component:

rm(list=ls(envir = attr(lm.object$terms, ".Environment")), envir = attr(lm.object$terms, ".Environment")) 

Be warned though: it'll also remove all the global environment objects it references.
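If wiping the global environment is too destructive, a less invasive variant (sketch only; `strip_env` is an invented helper name, untested against the asker's setup) repoints each model's terms environment at a tiny empty environment before saving. Note that the model frame keeps its own reference to the terms object, so both copies need stripping:

```r
# Sketch: strip the captured environment from each lm object in a flat list
# before saving, leaving the global environment itself untouched.
strip_env <- function(m) {
  e <- new.env(parent = emptyenv())                # tiny replacement environment
  attr(m$terms, ".Environment") <- e
  # m$model (the model frame) holds a separate reference to the original terms
  if (!is.null(m$model)) attr(attr(m$model, "terms"), ".Environment") <- e
  m
}

# Demo list standing in for the real ~200 models, each fit in an env with junk:
model.list <- lapply(1:2, function(i) local({
  junk <- rnorm(1e6)                               # gets captured by the formula env
  d <- data.frame(x = 1:10, y = rnorm(10))
  lm(y ~ x, data = d)
}))

model.list.small <- lapply(model.list, strip_env)
length(serialize(model.list, NULL))                # large: includes the junk
length(serialize(model.list.small, NULL))          # a few KB
```

The stripped list can then go through saveRDS as before. Do test predict() on the stripped models before relying on this, since some formula constructs look things up in that environment.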

answered Sep 21 '22 by IanCognito


For model objects you could also simply delete the reference to the environment, for example:

ctl <- c(4.17,5.58,5.18,6.11,4.50,4.61,5.17,4.53,5.33,5.14)
trt <- c(4.81,4.17,4.41,3.59,5.87,3.83,6.03,4.89,4.32,4.69)
group <- gl(2, 10, 20, labels = c("Ctl","Trt"))
weight <- c(ctl, trt)
lm.D9 <- lm(weight ~ group) 

attr(lm.D9$terms, ".Environment") <- NULL
saveRDS(lm.D9, file = "path_to_save.RDS")

This unfortunately breaks the model, but you can reattach an environment manually after loading:

lm.D9 <- readRDS("path_to_save.RDS")
attr(lm.D9$terms, ".Environment") <- globalenv()
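To make the round trip concrete, here is a self-contained version of the above (using tempfile() rather than a real path; the data and names are just illustrative):

```r
# Fit, strip the terms environment, save, reload, reattach, predict.
d <- data.frame(x = 1:10, y = (1:10) + rnorm(10))
lm.D9 <- lm(y ~ x, data = d)
attr(lm.D9$terms, ".Environment") <- NULL      # strip before saving

path <- tempfile(fileext = ".RDS")
saveRDS(lm.D9, path)

lm.D9 <- readRDS(path)                         # note: assign the result
attr(lm.D9$terms, ".Environment") <- globalenv()
predict(lm.D9, newdata = data.frame(x = 11))   # model is usable after reattaching
```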

This helped me in my specific use case and looks a bit safer to me...

answered Sep 18 '22 by mhwh