Using the exponential function on columns in Scala Spark: how to make it work

This is the code I wanted to implement. I am getting an overload error. Is there a way around it?

import scala.math._

dF = dF.withColumn("col2", 8.333 * exp($"col1"))
error: type mismatch;
 found   : org.apache.spark.sql.ColumnName
 required: Double

How would one perform exponential operations like this one?

Leothorn asked Nov 18 '25

1 Answer

You can use Spark's own version of the same function:

import org.apache.spark.sql.functions.exp

dF = dF.withColumn("col2", exp($"col1"))

You are trying to use the exp function from scala.math, which takes a Double, but you are passing it a Column, so the call does not type-check. Spark provides its own exp in org.apache.spark.sql.functions that operates on columns; use that instead. The constant 8.333 can still be multiplied in directly, since arithmetic between a Double and a Column yields a Column.
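A minimal end-to-end sketch of the fix, assuming a local SparkSession and a hypothetical dF built from sample data (the column name col1 comes from the question; everything else is illustrative):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.exp

object ExpColumnSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("exp-column-sketch")
      .master("local[*]") // local mode for illustration only
      .getOrCreate()
    import spark.implicits._

    // Hypothetical input: a single numeric column named col1
    var dF = Seq(0.0, 1.0, 2.0).toDF("col1")

    // Spark's exp works on Columns; 8.333 * Column yields a Column
    dF = dF.withColumn("col2", 8.333 * exp($"col1"))

    dF.show()
    spark.stop()
  }
}

Note that dF is declared as var here only to mirror the question's reassignment style; chaining withColumn on a val would be more idiomatic.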

Hope this helps!

koiralo answered Nov 21 '25

