I'm doing some analysis that looks something like this:
library(plyr)
input.files <- c("file1.txt", "file2.txt", "file3.txt")
input.data <- llply(input.files, load.file, .parallel=TRUE)
step.one.results <- llply(input.data, step.one, .parallel=TRUE)
step.two.results <- llply(step.one.results, step.two, .parallel=TRUE)
...
step.N.results <- llply(`step.N-1.results`, step.N, .parallel=TRUE)
...
Is there any way to make all the plyr functions parallel by default, so that I don't have to specify .parallel=TRUE for every step?
There are various packages in R that allow parallelization. The parallel package, which comes with your R installation, can run tasks in parallel by allocating cores to R: you find the number of cores on the system and allocate all of them, or a subset, to form a cluster. It is a merger of two historical packages, multicore and snow, and its functions overlap in name with those older packages.
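For plyr specifically, .parallel = TRUE needs a registered foreach backend; the sketch below uses doParallel, which is only one choice of backend (doMC and doSNOW work the same way) and is an assumption rather than something the question requires.

library(parallel)
library(doParallel)

n.cores <- detectCores() - 1   # leave one core free for the rest of the system
cl <- makeCluster(n.cores)     # build a cluster from those cores
registerDoParallel(cl)         # .parallel = TRUE in plyr now has workers to use

## ... run the llply() calls here ...

stopCluster(cl)                # release the workers when you are done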
plyr is an R package that makes it simple to split data apart, do stuff to it, and mash it back together. This is a common data-manipulation step. Importantly, plyr makes it easy to control the input and output data format from a syntactically consistent set of functions.
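As a small, made-up illustration of that consistency: the first letter of each function names the input type and the second the output type (l = list, d = data frame, a = array).

library(plyr)

x <- list(a = rnorm(5), b = rnorm(5))
llply(x, mean)   # list in, list out
ldply(x, mean)   # list in, data frame out
laply(x, mean)   # list in, array/vector out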
library(Defaults)
setDefaults(llply, .parallel=TRUE)
You'd have to call setDefaults on every function whose default formals you want to change. You can put this in your .Rprofile if you like.
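For example, a hypothetical .Rprofile snippet might look like this (assuming both plyr and Defaults are installed; plyr has to be attached before its defaults can be changed):

if (requireNamespace("plyr", quietly = TRUE) &&
    requireNamespace("Defaults", quietly = TRUE)) {
  library(plyr)
  library(Defaults)
  setDefaults(llply, .parallel = TRUE)
  setDefaults(ldply, .parallel = TRUE)   # repeat for each plyr function you use
}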
You can also mess with the formals directly. e.g.
formals(llply)$.parallel <- TRUE
should work.
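If you want that default for every plyr function at once, a loop along these lines should work. This is only a sketch: the pattern match on function names and the choice to shadow plyr's own copies in the global environment are mine, not part of the original answer.

library(plyr)

## Copy every *ply function that has a .parallel argument into the global
## environment with .parallel defaulting to TRUE, shadowing plyr's version.
for (fn in ls("package:plyr", pattern = "ply$")) {
  f <- get(fn, envir = asNamespace("plyr"))
  if (is.function(f) && ".parallel" %in% names(formals(f))) {
    formals(f)$.parallel <- TRUE
    assign(fn, f, envir = globalenv())
  }
}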
From my answer to another question:
As the Defaults package is no longer available from CRAN, you can use the default package instead.
As an example:
x <- list(a = 1, b = 2, c = 3)
default::default(unlist) <- list(use.names = FALSE)
unlist(x)
#> [1] 1 2 3
unlist <- default::reset_default(unlist)
unlist(x)
#> a b c
#> 1 2 3
Created on 2019-03-22 by the reprex package (v0.2.0.9000).
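Applied to the question, the same idiom should cover the plyr functions (a sketch, assuming plyr and default are both installed; the modified copies live in your global environment, and a parallel backend still has to be registered for .parallel = TRUE to do anything):

library(plyr)

default::default(llply) <- list(.parallel = TRUE)
default::default(ldply) <- list(.parallel = TRUE)   # and so on for the others

input.data <- llply(input.files, load.file)   # now parallel without the flag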