Running multiple jobs in the background at the same time (in parallel) in R

I want to run a program that needs a substantial amount of time. I want to write a function that can run in parallel (I am a graphical-interface user on Windows). The function divides the task into n sub-tasks and performs a final consensus step. I want to run the n tasks in parallel (at the same time, within the same program window) and then combine their outputs. The following is just an example:

ptm <- proc.time()
j1 <- cov(mtcars[1:10,], use="complete.obs") # job 1
j2 <- cov(mtcars[11:20,], use="complete.obs") # job 2
j3 <- cov(mtcars[21:32,], use="complete.obs") # job 3
proc.time() - ptm

out <- list (j1 = j1, j2 = j2, j3 = j3) 

I know that in Unix, "&" usually allows jobs to run in the background. Is there a similar way in R?
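Since the asker is on Windows, one portable way to express this split-and-combine pattern is a PSOCK cluster with parLapply. This is only a sketch: the chunk boundaries come from the example above, but the averaging "consensus" step is an assumption about what the final combining task might look like.

```r
library(parallel)

# Split the rows into n chunks, run each chunk on its own worker,
# then combine the results in the parent session.
chunks <- list(1:10, 11:20, 21:32)

cl <- makeCluster(length(chunks))  # PSOCK cluster: works on Windows too
out <- parLapply(cl, chunks, function(rows) {
  cov(mtcars[rows, ], use = "complete.obs")
})
stopCluster(cl)

# Hypothetical "consensus" step: average the per-chunk covariance matrices
consensus <- Reduce(`+`, out) / length(out)
```

mtcars lives in the datasets package, which the workers attach automatically, so nothing needs to be exported to the cluster here; your own data would need clusterExport() or should be passed as an argument.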

asked May 30 '12 by jon


1 Answer

You can use mclapply or clusterApply to launch several functions in parallel. They do not really run in the background: R waits until they are all finished (as if you had used wait in a Unix shell after launching the processes in the background).

library(parallel)
tasks <- list(
  job1 = function() cov(mtcars[1:10,],  use="complete.obs"),
  job2 = function() cov(mtcars[11:20,], use="complete.obs"),
  job3 = function() cov(mtcars[21:32,], use="complete.obs"),
  # To check that the computations are indeed running in parallel.
  job4 = function() for (i in 1:5) { cat("4"); Sys.sleep(1) },
  job5 = function() for (i in 1:5) { cat("5"); Sys.sleep(1) },
  job6 = function() for (i in 1:5) { cat("6"); Sys.sleep(1) }
)

# Using fork() -- Unix-alikes only; on Windows, mclapply runs serially
out <- mclapply( 
  tasks, 
  function(f) f(), 
  mc.cores = length(tasks) 
)

# Equivalently: create a cluster and destroy it.
# (This may work on Windows as well.)
cl <- makeCluster( length(tasks) )
out <- clusterApply( 
  cl,
  tasks,
  function(f) f()
)
stopCluster(cl)
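If you want the true analogue of a shell's "&", where the launch returns immediately and you collect the results later, the same parallel package offers mcparallel() and mccollect(). A minimal sketch (fork-based, so it does not work on Windows):

```r
library(parallel)

# mcparallel() returns immediately, like `&` in a shell;
# mccollect() is the analogue of `wait`.
p1 <- mcparallel(cov(mtcars[1:10, ],  use = "complete.obs"))
p2 <- mcparallel(cov(mtcars[11:20, ], use = "complete.obs"))
p3 <- mcparallel(cov(mtcars[21:32, ], use = "complete.obs"))

# The parent session is free to do other work here...

out <- mccollect(list(p1, p2, p3))  # blocks until all three jobs finish
```

mccollect() returns a list of the three results, which you can then combine however the consensus step requires.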
answered Nov 03 '22 by Vincent Zoonekynd