How to add map column in spark based on other column?

I have this table:

|Name|Val|
|----|---|
|Bob |1  |
|Marl|3  |

And I want to transform it to a map with single element like this:

|Name|Val|MapVal|
|----|---|------|
|Bob |1  |(0->1)|
|Marl|3  |(0->3)|

Any idea how to do it in scala? I couldn't find any way to build a map in withColumn statement...

asked Mar 09 '23 by ElyashivLavi
1 Answer

Found it. You just need to import the Spark SQL functions package:

```scala
import org.apache.spark.sql.functions._
```

And then use the `map` function:

```scala
df.withColumn("MapVal", map(lit(0), col("Val")))
```
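Putting it together, a minimal self-contained sketch (the table and column names are taken from the question; the `SparkSession` setup and object name are illustrative boilerplate):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, lit, map}

object MapColumnExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("MapColumnExample")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // The example table from the question
    val df = Seq(("Bob", 1), ("Marl", 3)).toDF("Name", "Val")

    // map(key, value, ...) builds a MapType column from alternating
    // key/value expressions; lit(0) supplies the constant key 0
    val withMap = df.withColumn("MapVal", map(lit(0), col("Val")))

    // MapVal is Map(0 -> 1) for Bob and Map(0 -> 3) for Marl
    withMap.show(false)

    spark.stop()
  }
}
```

Note that `map` takes a varargs list of column expressions, so `map(lit(0), col("Val"), lit(1), col("Other"))` would build a two-entry map.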

answered Mar 12 '23 by ElyashivLavi