Let me say first that I've read Writing R Extensions and the Rcpp package vignette, and that I've built a package from Rcpp.package.skeleton().
Since building my package, I added a function, multiGenerateCSVrow(), and then ran compileAttributes() on the package directory before R CMD build and R CMD INSTALL. After I load my package, I can run the function either directly or via foreach() with the %do% operator.
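For reference, here is the serial version that works (a sketch of my setup; NIsample and randomsCutPlus1 are data objects in my session):

library(foreach)
library(myPackage)

# %do% evaluates in the master R process, where the package's compiled
# routine is registered, so the call succeeds.
rows <- foreach(i = 1:8, .combine = rbind) %do%
  multiGenerateCSVrow(scoreMatrix  = NIsample,
                      validMatrix  = matrix(1, nrow = 10, ncol = 10),
                      cutoffVector = rep(0, 10),
                      factorVector = randomsCutPlus1[i, ],
                      actualVector = rep(1, 10),
                      scaleSample  = 1)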
When I try to run in parallel, however, I get an error:
cl <- makePSOCKcluster(8)
registerDoParallel(cl)
rows <- foreach(i = 1:8, .combine = rbind, .packages = "myPackage") %dopar%
  multiGenerateCSVrow(scoreMatrix  = NIsample,
                      validMatrix  = matrix(1, nrow = 10, ncol = 10),
                      cutoffVector = rep(0, 10),
                      factorVector = randomsCutPlus1[i, ],
                      actualVector = rep(1, 10),
                      scaleSample  = 1)
stopCluster(cl)
Error in multiGenerateCSVrow(scoreMatrix = NIsample, validMatrix = matrix(1, :
task 1 failed - "NULL value passed as symbol address"
Here's the package NAMESPACE:
# Generated by roxygen2 (4.0.1): do not edit by hand
useDynLib(myPackage)
exportPattern("^[[:alpha:]]+")
importFrom(Rcpp, evalCpp)
Here's the relevant chunk of RcppExports.cpp:
// multiGenerateCSVrow
SEXP multiGenerateCSVrow(SEXP scoreMatrix, SEXP validMatrix, SEXP cutoffVector, SEXP factorVector, SEXP actualVector, SEXP scaleSample);
RcppExport SEXP myPackage_multiGenerateCSVrow(SEXP scoreMatrixSEXP, SEXP validMatrixSEXP, SEXP cutoffVectorSEXP, SEXP factorVectorSEXP, SEXP actualVectorSEXP, SEXP scaleSampleSEXP) {
BEGIN_RCPP
    SEXP __sexp_result;
    {
        Rcpp::RNGScope __rngScope;
        Rcpp::traits::input_parameter< SEXP >::type scoreMatrix(scoreMatrixSEXP);
        Rcpp::traits::input_parameter< SEXP >::type validMatrix(validMatrixSEXP);
        Rcpp::traits::input_parameter< SEXP >::type cutoffVector(cutoffVectorSEXP);
        Rcpp::traits::input_parameter< SEXP >::type factorVector(factorVectorSEXP);
        Rcpp::traits::input_parameter< SEXP >::type actualVector(actualVectorSEXP);
        Rcpp::traits::input_parameter< SEXP >::type scaleSample(scaleSampleSEXP);
        SEXP __result = multiGenerateCSVrow(scoreMatrix, validMatrix, cutoffVector, factorVector, actualVector, scaleSample);
        PROTECT(__sexp_result = Rcpp::wrap(__result));
    }
    UNPROTECT(1);
    return __sexp_result;
END_RCPP
}
And RcppExports.R:
multiGenerateCSVrow <- function(scoreMatrix, validMatrix, cutoffVector, factorVector, actualVector, scaleSample) {
    .Call('myPackage_multiGenerateCSVrow', PACKAGE = 'myPackage', scoreMatrix, validMatrix, cutoffVector, factorVector, actualVector, scaleSample)
}
What could it be looking for?
I had a similar problem and solved it by adding .noexport = c(<functions implemented in C++>) to the foreach() call.
My guess is that these functions get exported from the global environment into the parallel workers, but since they are not ordinary R functions (they hold a pointer into compiled code that is only valid in the master process), the exported copies don't actually work. This does mean the functions have to be loaded separately on each node; in my case that was a SNOW clusterCall() that sourced various files, including the C++ code.
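A minimal sketch of that pattern, borrowing the function and arguments from the question (load_cpp_functions.R is a placeholder name for whatever script sourceCpp()s your C++ sources):

library(doParallel)

cl <- makePSOCKcluster(8)
registerDoParallel(cl)

# Compile/load the C++ functions inside each worker process, so every
# worker holds its own valid pointer to the compiled code.
clusterCall(cl, function() {
  source("load_cpp_functions.R")  # placeholder file name
  NULL
})

# .noexport keeps foreach from serializing the master's copy of the
# function, whose compiled-code pointer is meaningless on a worker.
rows <- foreach(i = 1:8, .combine = rbind,
                .noexport = "multiGenerateCSVrow") %dopar%
  multiGenerateCSVrow(scoreMatrix  = NIsample,
                      validMatrix  = matrix(1, nrow = 10, ncol = 10),
                      cutoffVector = rep(0, 10),
                      factorVector = randomsCutPlus1[i, ],
                      actualVector = rep(1, 10),
                      scaleSample  = 1)

stopCluster(cl)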
I also had the problem that functions using Rcpp would not work within foreach. As suggested by Patrick McCarthy, I put the function in a package, installed and loaded the package, and passed it to foreach with .packages = "...". I still got some errors, but those were resolved after updating all of the involved packages.
(I would have commented, but I do not have enough reputation, and I thought this might be helpful for some people.)
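Putting the two answers together, a sketch of the package-based pattern for the question's function (assuming myPackage is installed on the machine where the workers run):

library(doParallel)
library(myPackage)

cl <- makePSOCKcluster(8)
registerDoParallel(cl)

# .packages loads myPackage on every worker, so each worker gets a valid
# pointer to the compiled routine; .noexport stops foreach from shipping
# the master's function object to the workers.
rows <- foreach(i = 1:8, .combine = rbind,
                .packages = "myPackage",
                .noexport = "multiGenerateCSVrow") %dopar%
  multiGenerateCSVrow(scoreMatrix  = NIsample,
                      validMatrix  = matrix(1, nrow = 10, ncol = 10),
                      cutoffVector = rep(0, 10),
                      factorVector = randomsCutPlus1[i, ],
                      actualVector = rep(1, 10),
                      scaleSample  = 1)

stopCluster(cl)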
Inspired by the answers from @henine and @jmb, I tried the "reverse" option: I actually source my R file with the Rcpp functions inside my foreach loop, and make sure to include "Rcpp" in the .packages option of foreach. It might not be the most efficient approach, but it does the job and is simple.
Something like:
cl <- makeCluster(n_cores, outfile = "")
registerDoParallel(cl)
foreach(n = 1:N, .packages = "Rcpp", .noexport = "<name of Rcpp function>") %dopar% {
  source("Scripts/Rcpp_functions.R")
  ### do stuff with functions scripted in Rcpp_functions.R
}
stopCluster(cl)  # the cluster was created explicitly, so stop it explicitly
And similarly to @jmb, I would have commented, but I don't have enough reputation :D