 

Aggregate a Spark data frame using an array of column names, retaining the names

I would like to aggregate a Spark DataFrame using an array of column names as input, while retaining the original names of the columns.

df.groupBy($"id").sum(colNames:_*)

This works but fails to preserve the names (the result columns come back as sum(<name>) rather than <name>). Inspired by the answer found here, I unsuccessfully tried this:

df.groupBy($"id").agg(sum(colNames:_*).alias(colNames:_*))
error: no `: _*' annotation allowed here

It works for a single element, for example:

df.groupBy($"id").agg(sum(colNames(2)).alias(colNames(2)))

How can I make this happen for the entire array?

asked Feb 06 '23 by johnblund
1 Answer

Just provide a sequence of columns with aliases. Since agg takes a first expression followed by varargs, build the aliased expressions up front and splat the rest:

import org.apache.spark.sql.functions.sum

val colNames: Seq[String] = ???
// One sum expression per column, aliased back to the original name
val exprs = colNames.map(c => sum(c).alias(c))
df.groupBy($"id").agg(exprs.head, exprs.tail: _*)
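For illustration, here is a minimal self-contained sketch of the pattern; the data and the column names amount and fee are made up:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum

val spark = SparkSession.builder.master("local[*]").getOrCreate()
import spark.implicits._

// Hypothetical data; any numeric columns work
val df = Seq((1, 10.0, 1.0), (1, 20.0, 2.0), (2, 30.0, 3.0)).toDF("id", "amount", "fee")

val colNames = Seq("amount", "fee")
val exprs = colNames.map(c => sum(c).alias(c))

// Result columns are named amount and fee, not sum(amount) and sum(fee)
df.groupBy($"id").agg(exprs.head, exprs.tail: _*).show()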
answered Feb 09 '23 by zero323