Is it bad practice to compile shared libraries from inside an R session using system() or shell() and then dynamically load them?

In an R session, I want to create a function that will take a user-supplied value, substitute it into a template C source file, compile the result with R CMD SHLIB, and dynamically load the shared library.
Call the template C source file template.tmp, and let it look something like:
...
#define VAR
...
(I don't plan on actually compiling template.tmp, hence the .tmp file extension. I merely use it as a template to create other source files where only the #define directive changes.)
Below is the R function I created to do the steps above. For simplicity, I removed lines of code that declare the correct path names or add portability to Windows and other Unix-alikes.
myfunc <- function(val){
  # create commands for system()
  subst_cmd <- paste0("sed 's/define VAR/define VAR ", val, "/' template.tmp > newprog.c")
  shlib_cmd <- "R CMD SHLIB newprog.c"
  # submit commands to system()
  system(subst_cmd)
  system(shlib_cmd)
  # dynamically load shared library
  dyn.load("newprog.so")
}
Thus, myfunc(12345) will create a new C source file with the following lines, compile it to a shared library, and dynamically load it into the current R session.
...
#define VAR 12345
...
I have a function where speed is very important, as it could be called thousands of times. Passing values to the function in my C source file slows it down tremendously compared to using a #define preprocessor directive. My thinking was to somehow edit the file within R, then compile it and load it into the R session, but I do not know whether this is good practice or whether there is another way to accomplish the same task. One problem I have already encountered occurs when I remote into a computer cluster where the login node can compile but the compute nodes cannot.
Just add -Dxxx=yy on the command line (xxx is the name of the macro and yy the replacement, or just -Dxxx if there is no value).
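For example, from an R session the flag can be supplied through PKG_CPPFLAGS, which the standard makefiles used by R CMD SHLIB read from the environment. A minimal sketch, assuming newprog.c uses VAR but no longer defines it itself (the file and macro names follow the question):

compile_with_macro <- function(val) {
  Sys.setenv(PKG_CPPFLAGS = paste0("-DVAR=", val))    # e.g. -DVAR=12345
  system("R CMD SHLIB newprog.c")                     # no sed rewrite of the source needed
  dyn.load(paste0("newprog", .Platform$dynlib.ext))   # .so on Unix, .dll on Windows
}

Note this still recompiles for every value; it only removes the step of editing the source file.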
The #define creates a macro, which is the association of an identifier or parameterized identifier with a token string. After the macro is defined, the compiler can substitute the token string for each occurrence of the identifier in the source file.
#define is a preprocessor directive that is used to define macros in a C program; it is also known as a macro directive. The #define directive declares a constant value or an expression under a name that can then be used throughout the C program.
Macro definitions are not variables and cannot be changed by your program code like variables. You generally use this syntax when creating constants that represent numbers, strings or expressions.
Disclaimer: I don't have much knowledge about R. I am answering this based on my experience of using C shared libraries.
Is it bad practice to compile shared libraries from inside an R session using system() or shell() then dynamically load them?
Yes, it is not good practice. As you say, it may be called thousands of times, and changing the file and compiling every time is not a good option.
Is there a "better" way (than the code below) to change the value of a macro variable in a C source file based on user input from an R session?
If the only requirement is changing the value of a macro, then why not convert it to a variable and pass that variable when calling the function in the dynamically loaded library?
Consider the following example (the code below is for the dynamic library):
foo.h
#ifndef foo_h__
#define foo_h__
extern void foo(int var);
#endif // foo_h__
foo.c
#include "foo.h"

/* value supplied from R at run time, instead of a compile-time #define */
int global_var;

void foo(int var)
{
    global_var = var;
}
Call the above function from R, passing the desired value of var to foo. (I hope you know how to do this.)
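For completeness, a minimal sketch of the R side, assuming the library is built with R CMD SHLIB foo.c and that foo is adapted to the .C calling convention, which passes every argument as a pointer (i.e. void foo(int *var) { global_var = *var; }):

dyn.load(paste0("foo", .Platform$dynlib.ext))   # load foo.so / foo.dll once per session
.C("foo", as.integer(12345))                    # sets global_var to 12345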
I don't plan on actually compiling template.tmp, hence the .tmp file extension. I merely use it as a template to create other source files where only the #define directive changes.
I suggest not doing this. Even if you have more than one such macro, you can still use the logic above to handle it. My suggestion is to compile the library once and then call an initialisation function (foo in the example above), passing it the required values. This also avoids loading the library multiple times, which makes it more efficient. It is better than what you are doing currently, and it is easier to maintain and document.
File operations are inherently slow, which matters if you are concerned about speed and efficiency. Your program may some day become large, so compiling it for every function call will add further delay to execution. It is a bad idea to compile the library every time.
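To illustrate the suggested workflow, a rough sketch that keeps the compiler out of the hot path (same hypothetical foo library and pointer-based signature as in the sketch above):

system("R CMD SHLIB foo.c")                     # compile once, e.g. on the login node
dyn.load(paste0("foo", .Platform$dynlib.ext))   # load once per R session
for (val in seq_len(10000)) {
  .C("foo", as.integer(val))                    # thousands of cheap calls, no compiler, no file I/O
}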