
Using parLapply and clusterExport inside a function

I asked a related question here and the response worked well: using parallel's parLapply: unable to access variables within parallel code

The problem is that when I try to use the answer inside of a function it won't work; I think this has to do with the default environment used by clusterExport. I've read the vignette and looked at the help file, but am approaching this with a very limited knowledge base. I expected parLapply to behave similarly to lapply, but it doesn't appear to.

Here is my attempt:

```r
par.test <- function(text.var, gc.rate=10){
    ntv <- length(text.var)
    require(parallel)
    pos <- function(i) {
        paste(sapply(strsplit(tolower(i), " "), nchar), collapse=" | ")
    }
    cl <- makeCluster(mc <- getOption("cl.cores", 4))
    clusterExport(cl=cl, varlist=c("text.var", "ntv", "gc.rate", "pos"))
    parLapply(cl, seq_len(ntv), function(i) {
        x <- pos(text.var[i])
        if (i %% gc.rate == 0) gc()
        return(x)
    })
}

par.test(rep("I like cake and ice cream so much!", 20))
```

This gives the following error message:

```
> par.test(rep("I like cake and ice cream so much!", 20))
Error in get(name, envir = envir) : object 'text.var' not found
```
asked Aug 19 '12 by Tyler Rinker

2 Answers

By default, clusterExport looks in .GlobalEnv for the objects named in varlist. If your objects are not in .GlobalEnv, you must tell clusterExport in which environment to find them.

You can change your clusterExport call to the following (which I didn't test, but which you said in the comments works):

```r
clusterExport(cl=cl, varlist=c("text.var", "ntv", "gc.rate", "pos"),
              envir=environment())
```

This way, it will look in the function's environment for the objects to export.
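Putting that fix in context, the question's function with only the envir change might look like this (a sketch, not the answerer's exact code: it also adds a stopCluster on exit, which the question's code omitted, and uses a small 2-worker cluster for the example):

```r
library(parallel)

par.test <- function(text.var, gc.rate = 10) {
    ntv <- length(text.var)
    pos <- function(i) {
        paste(sapply(strsplit(tolower(i), " "), nchar), collapse = " | ")
    }
    cl <- makeCluster(2)  # small cluster for the example
    on.exit(stopCluster(cl))
    # envir = environment() points clusterExport at this function's
    # frame instead of .GlobalEnv, so the local variables are found
    clusterExport(cl = cl,
                  varlist = c("text.var", "ntv", "gc.rate", "pos"),
                  envir = environment())
    parLapply(cl, seq_len(ntv), function(i) {
        x <- pos(text.var[i])
        if (i %% gc.rate == 0) gc()
        x
    })
}

res <- par.test(rep("I like cake and ice cream so much!", 2))
res[[1]]  # "1 | 4 | 4 | 3 | 3 | 5 | 2 | 5"
```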

answered Oct 19 '22 by GSee

Another solution is to pass the additional variables as arguments to your function; parLapply exports them too. If text.var is the big data, it pays to make it the object that is applied over, rather than an index, because then only the portion of text.var relevant to each worker is exported, rather than the whole object to every worker.

```r
par.test <- function(text.var) {
    require(parallel)
    pos <- function(i) {
        paste(sapply(strsplit(tolower(i), " "), nchar), collapse = " | ")
    }
    cl <- makeCluster(getOption("cl.cores", 4))
    on.exit(stopCluster(cl))
    # apply over text.var itself; pos travels to the workers as an
    # argument, so no clusterExport is needed. The per-iteration gc()
    # from the question is dropped (and gc.rate with it): there is no
    # index here, and explicit garbage collection is rarely necessary.
    parLapply(cl, text.var, function(text.vari, pos) {
        pos(text.vari)
    }, pos)
}
```

This is also conceptually pleasing. (It's rarely necessary to explicitly invoke the garbage collector).
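The argument-passing mechanism can be seen in isolation with a toy call (the names here are illustrative, not from the question):

```r
library(parallel)

cl <- makeCluster(2)
# arguments after the worker function ("sep" here) are shipped to the
# workers along with each element of the data being applied over
res <- parLapply(cl, c("a b", "c d e"), function(x, sep) {
    length(strsplit(x, sep)[[1]])
}, " ")
stopCluster(cl)
unlist(res)  # 2 3
```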

Memory management when source()ing a script causes additional problems. Compare

```
> stop("oops")
Error: oops
> traceback()
1: stop("oops")
```

with the same call in a script

```
> source("foo.R")
Error in eval(ei, envir) : oops
> traceback()
5: stop("oops") at foo.R#1
4: eval(ei, envir)
3: eval(ei, envir)
2: withVisible(eval(ei, envir))
1: source("foo.R")
```

Remember that R's serialize() function, used internally by parLapply() to move data to the workers, serializes everything up to .GlobalEnv. So data objects created at the top level of a script are serialized to the workers, whereas in an interactive session they would not be. This may account for @SeldeomSeenSlim's problems when running a script. The solution is probably to separate 'data' from 'algorithm' more clearly, e.g., using the file system or a database or ... to store objects.
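One way to sketch that separation (a hypothetical illustration, not code from the answer): write the big object to disk, then have each worker read it back itself, so only a short file path ever travels through serialize():

```r
library(parallel)

# hypothetical sketch: store the data outside the R session
path <- tempfile(fileext = ".rds")
saveRDS(rep("I like cake", 4), path)

cl <- makeCluster(2)
clusterExport(cl, "path")                    # send only the file path
clusterEvalQ(cl, text.var <- readRDS(path))  # each worker loads its own copy
res <- parLapply(cl, 1:4, function(i) nchar(text.var[i]))
stopCluster(cl)
```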

answered Oct 19 '22 by Martin Morgan