I think meta-programming is the right term here.
I want to be able to use data.table much like one would use MySQL in, say, a web app. That is, web users use some web front end (Shiny Server, for example) to select a database, select columns to filter on, select columns to group by, and select columns to aggregate along with aggregation functions. I want to use R and data.table as the back end for querying, aggregation, etc. Assume that the front end exists and that R receives these choices as character strings, already validated.
I wrote the following function to build the data.table expression and use the parse/eval meta-programming functionality of R to run it. Is this a reasonable way to do this?
I've included all relevant code to test this. Source this code (after reading it for security!) and run test_agg_meta() to test it. It is just a start; I could add more functionality.
But my main question is whether I am grossly over-thinking this. Is there a more direct way to use data.table when all of the inputs are undetermined beforehand, without resorting to parse/eval meta-programming?
I am also aware of `with` and some of the other sugar-free functional methods, but I don't know whether they can handle all cases.
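For example, I know that plain selection from a character vector of column names can be done like this (a minimal sketch; `dt` and `cols` are just illustrative names, not part of my actual code):

library(data.table)
dt   <- data.table(a = 1:3, b = 4:6, c = 7:9)
cols <- c("a", "c")          # column names arriving as strings
dt[, cols, with = FALSE]     # select those columns without parse/eval

But I don't see how to get from there to arbitrary filters, group-bys, and aggregations.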
require(data.table)
fake_data <- function(num = 12) {
  # make some fake data
  x = 1:num
  lets = letters[1:num]
  data = data.table(
    u = rep(c("A", "B", "C"), floor(num / 3)),
    v = x %% 2, w = lets, x = x, y = x^2, z = 1 - x)
  return(data)
}
data_table_meta <- function(
  # aggregate a data.table meta-programmatically
  data_in = fake_data(),
  filter_cols = NULL,
  filter_min = NULL,
  filter_max = NULL,
  groupby_cols = NULL,
  agg_cols = setdiff(names(data_in), groupby_cols),
  agg_funcs = NULL,
  verbose = F,
  validate = T,
  jsep = "_"
) {
  all_cols = names(data_in)
  if (validate) {
    stopifnot(length(filter_cols) == length(filter_min))
    stopifnot(length(filter_cols) == length(filter_max))
    stopifnot(filter_cols %in% all_cols)
    stopifnot(groupby_cols %in% all_cols)
    stopifnot(length(intersect(agg_cols, groupby_cols)) == 0)
    stopifnot((length(agg_cols) == length(agg_funcs)) | (length(agg_funcs) == 1) | (length(agg_funcs) == 0))
  }
  # build the command
  # defaults
  i_filter = ""
  j_select = ""
  n_agg_funcs = length(agg_funcs)
  n_agg_cols = length(agg_cols)
  n_groupby_cols = length(groupby_cols)
  if (n_agg_funcs == 0) {
    # no aggregation functions: j just selects the columns
    print("NULL")
    j_select = paste(agg_cols, collapse = ",")
    j_select = paste("list(", j_select, ")")
  } else {
    # build "<func>_<col>=<func>(<col>)" terms for j
    agg_names = paste(agg_funcs, agg_cols, sep = jsep)
    jsels = paste(agg_names, "=", agg_funcs, "(", agg_cols, ")", sep = "")
    if (n_groupby_cols > 0) jsels = c(jsels, "N_Rows_Aggregated=.N")
    j_select = paste(jsels, collapse = ",")
    j_select = paste("list(", j_select, ")")
  }
  groupby = ""
  if (n_groupby_cols > 0) {
    groupby = paste(groupby_cols, collapse = ",")
    groupby = paste("by=list(", groupby, ")", sep = "")
  }
  n_filter_cols = length(filter_cols)
  if (n_filter_cols > 0) {
    # build "(col >= min & col <= max)" terms for i and join them with &
    i_filters = rep("", n_filter_cols)
    for (i in 1:n_filter_cols) {
      i_filters[i] = paste(" (", filter_cols[i], " >= ", filter_min[i], " & ", filter_cols[i], " <= ", filter_max[i], ") ", sep = "")
    }
    i_filter = paste(i_filters, collapse = "&")
  }
  command = paste("data_in[", i_filter, ",", j_select, ",", groupby, "]", sep = "")
  if (verbose == 2) {
    print("all_cols:")
    print(all_cols)
    print("filter_cols:")
    print(filter_cols)
    print("agg_cols:")
    print(agg_cols)
    print("filter_min:")
    print(filter_min)
    print("filter_max:")
    print(filter_max)
    print("groupby_cols:")
    print(groupby_cols)
    print("agg_cols:")
    print(agg_cols)
    print("agg_funcs:")
    print(agg_funcs)
    print("i_filter")
    print(i_filter)
    print("j_select")
    print(j_select)
    print("groupby")
    print(groupby)
    print("command")
    print(command)
  }
  print(paste("evaluating command:", command))
  eval(parse(text = command))
}
my_agg <- function(data = fake_data()) {
  data_out = data[
    i = x <= 5,
    j = list(
      mean_x = mean(x),
      mean_y = mean(y),
      sum_z = sum(z),
      N_Rows_Aggregated = .N
    ),
    by = list(u, v)]
  return(data_out)
}
my_agg_meta <- function(data = fake_data()) {
  # should give same results as my_agg
  data_out = data_table_meta(data,
    filter_cols = c("x"),
    filter_min = c(-10000),
    filter_max = c(5),
    groupby_cols = c("u", "v"),
    agg_cols = c("x", "y", "z"),
    agg_funcs = c("mean", "mean", "sum"),
    verbose = T,
    validate = T,
    jsep = "_")
  return(data_out)
}
test_agg_meta <- function() {
  stopifnot(all(my_agg() == my_agg_meta()))
  print("Congrats, you passed the test")
}
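To illustrate the intended interface, here is one more hypothetical call (the column names come from fake_data(); the argument values are arbitrary):

# another example call, beyond the my_agg_meta() test above
data_table_meta(fake_data(),
                filter_cols  = "x",
                filter_min   = 3,
                filter_max   = 10,
                groupby_cols = "u",
                agg_cols     = c("x", "y"),
                agg_funcs    = c("sum", "max"))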
While your functions certainly look interesting, I believe you are asking whether there are other ways to go about it.
Personally, I like to use something like this:
## SAMPLE DATA
DT1 <- data.table(id=sample(LETTERS[1:4], 20, TRUE), Col1=1:20, Col2=rnorm(20))
DT2 <- data.table(id=sample(LETTERS[3:8], 20, TRUE), Col1=sample(100:500, 20), Col2=rnorm(20))
DT3 <- data.table(id=sample(LETTERS[19:20], 20, TRUE), Col1=sample(100:500, 20), Col2=rnorm(20))
This is straightforward, much like accessing any object in R by name:
# use strings to select the table
tablesSelected <- "DT3"
# use get to access them
get(tablesSelected)
# and we can perform operations:
get(tablesSelected)[, list(C1mean=mean(Col1), C2mean=mean(Col2))]
To select columns by reference to their names, use the `.SDcols` argument.
Given a vector of column names:
columnsSelected <- c("Col1", "Col2")
Assign that vector to the .SDcols argument:
## Here we are simply accessing those columns
DT3[, .SD, .SDcols = columnsSelected]
We can also apply a function to each column named in the string vector:
## apply a function to each column
DT3[, lapply(.SD, mean), .SDcols = columnsSelected]
Note that if our goal is simply to output the columns, we can turn off `with`:
# This works for displaying
DT3[, columnsSelected, with=FALSE]
Note: a more "modern" way of doing this is to use the `..` shortcut to access `columnsSelected` from "up one level":
DT3[ , ..columnsSelected]
However, if using `with=FALSE`, we cannot then operate directly on the columns in the usual fashion:
## This does NOT work:
DT3[, someFunc(columnsSelected), with=FALSE]
## This DOES work:
DT3[, someFunc(.SD), .SDcols=columnsSelected]
## This also works, but is less ideal, ie assigning to new columns is more cumbersome
DT3[, columnsSelected, with=FALSE][, someFunc(.SD)]
We can also use `get`, but it is a bit trickier. I am leaving it here for reference, but `.SDcols` is the way to go.
## we need to use `get`, but inside `j`
## AND IN A WRAPPER FUNCTION <~~~~~ THIS IS VITAL
DT3[, lapply(columnsSelected, function(.col) get(.col))]
## We can execute functions on the columns:
DT3[, lapply(columnsSelected, function(.col) mean( get(.col) ))]
## And of course, we can use more involved-functions, much like any *ply call:
# using .SDcols
DT3[, lapply(.SD, function(.col) c(mean(.col) + 2*sd(.col), mean(.col) - 2*sd(.col))), .SDcols = columnsSelected]
# using `get` and assigning the value to a var.
# Note that this method has memory drawbacks, so using .SDcols is preferred
DT3[, lapply(columnsSelected, function(.col) {TheCol <- get(.col); c(mean(TheCol) + 2*sd(TheCol), mean(TheCol) - 2*sd(TheCol))})]
For reference, if you try the following, you will notice that they do not produce the results we are after.
## this DOES NOT work (need ..columnsSelected)
DT3[, columnsSelected]
## neither does this
DT3[, eval(columnsSelected)]
## still does not work:
DT3[, lapply(columnsSelected, get)]
If you want to change the name of the columns:
# Using the `.SDcols` method: change names using `setnames` (lowercase "n")
DT3[, setnames(.SD, c("new.Name1", "new.Name2")), .SDcols =columnsSelected]
# Using the `get` method:
## The names of the new columns will be the names of the `columnsSelected` vector
## Thus, if we want to preserve the names, use the following:
names(columnsSelected) <- columnsSelected
DT3[, lapply(columnsSelected, function(.col) get(.col))]
## we can also use this trick to give the columns new names
names(columnsSelected) <- c("new.Name1", "new.Name2")
DT3[, lapply(columnsSelected, function(.col) get(.col))]
Clearly, using .SDcols is easier and more elegant.
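As a side note, with `.SDcols` you can also name the results directly in `j`; a small sketch using base `setNames`:

## name the aggregated results in one step
DT3[, setNames(lapply(.SD, mean), c("new.Name1", "new.Name2")), .SDcols = columnsSelected]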
by
# `by` is straightforward: you can use a vector of strings in the `by` argument.
# lets add another column to show how to use two columns in `by`
DT3[, secondID := sample(letters[1:2], 20, TRUE)]
# here is our string vector:
byCols <- c("id", "secondID")
# and here is our call
DT3[, lapply(columnsSelected, function(.col) mean(get(.col))), by=byCols]
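Putting these pieces together, here is a hedged sketch of how something like the question's my_agg() could be driven entirely by character inputs, with no parse/eval (the variable names below are illustrative, and fake_data() is the question's function):

dt        <- fake_data()
aggCols   <- c("x", "y", "z")
aggFuncs  <- c("mean", "mean", "sum")
groupCols <- c("u", "v")
filterCol <- "x"
filterMax <- 5

dt[get(filterCol) <= filterMax,
   c(setNames(lapply(seq_along(aggCols),
                     function(i) get(aggFuncs[i])(.SD[[aggCols[i]]])),
              paste(aggFuncs, aggCols, sep = "_")),
     list(N_Rows_Aggregated = .N)),
   by = groupCols,
   .SDcols = aggCols]

This should produce the same columns as my_agg() / my_agg_meta() for the x <= 5 filter, without any string-built expressions.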
We can access the data.table by reference to its name and then select its columns also by name:
get(tablesSelected)[, .SD, .SDcols=columnsSelected]
## OR WITH MULTIPLE TABLES
tablesSelected <- c("DT1", "DT3")
lapply(tablesSelected, function(.T) get(.T)[, .SD, .SDcols=columnsSelected])
# we may want to name the vector for neatness, since
# the resulting list inherits the names.
names(tablesSelected) <- tablesSelected
Since so much in data.table is pass-by-reference, it is easy to have a list of tables, a separate list of columns to add, and yet another list of columns to operate on, and put it all together to perform similar operations -- but with different inputs -- on all your tables. Unlike doing something similar with data.frame, there is no need to reassign the end result.
newColumnsToAdd <- c("UpperBound", "LowerBound")
FunctionToExecute <- function(vec) c(mean(vec) - 2*sd(vec), mean(vec) + 2*sd(vec))
# note the list of column names per table!
columnsUsingPerTable <- list("DT1" = "Col1", DT2 = "Col2", DT3 = "Col1")
tablesSelected <- names(columnsUsingPerTable)
byCols <- c("id")
# TADA:
dummyVar <- # I use `dummyVar` because I do not want to display the output
lapply(tablesSelected, function(.T)
get(.T)[, c(newColumnsToAdd) := lapply(.SD, FunctionToExecute), .SDcols=columnsUsingPerTable[[.T]], by=byCols ] )
# Take a look at the tables now:
DT1
DT2
DT3