 

How to execute Column expression in spark without dataframe

Is there any way to evaluate a Column expression when it is built only from literals (no DataFrame columns)?

For example, something like:

val result: Int = someFunction(lit(3) * lit(5))
//result: Int = 15

or

import org.apache.spark.sql.functions.sha1
val result: String = someFunction(sha1(lit("5")))
//result: String = ac3478d69a3c81fa62e60f5c3696165a4e5e6ac4

I am able to evaluate it using a DataFrame:

val result = Seq(1).toDF.select(sha1(lit("5"))).as[String].first
//result: String = ac3478d69a3c81fa62e60f5c3696165a4e5e6ac4

But is there any way to get the same result without using a DataFrame?

asked Jul 06 '18 by Kaushal

1 Answer

To evaluate a literal Column you can convert it to an Expression and call eval without providing an input row:

scala> sha1(lit("1").cast("binary")).expr.eval()
res1: Any = 356a192b7913b04c54574d18c28d46e6395428ab

As long as the function is a UserDefinedFunction, it will work the same way:

scala> val f = udf((x: Int) => x)
f: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,IntegerType,Some(List(IntegerType)))

scala> f(lit(3) * lit(5)).expr.eval()
res3: Any = 15
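The pattern above can be wrapped in a small helper. This is a minimal sketch, not part of Spark's API: evalColumn is a hypothetical name, and it assumes the Column contains only foldable literal expressions (note the explicit cast to binary for sha1, since eager eval skips the analyzer's implicit casts):

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{lit, sha1}

// Hypothetical helper: eagerly evaluate a literal-only Column by
// eval-ing its underlying Catalyst expression with no input row.
def evalColumn(c: Column): Any = c.expr.eval()

val n = evalColumn(lit(3) * lit(5))
// n: Any = 15

// sha1 expects binary input; cast explicitly because no analyzer runs here.
val h = evalColumn(sha1(lit("5").cast("binary"))).toString
// h: String = ac3478d69a3c81fa62e60f5c3696165a4e5e6ac4
```

Note that eval returns Any (and for string-valued expressions an internal UTF8String), so convert the result yourself when a specific type is needed.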
answered Nov 08 '22 by zero323