 

How to convert map to dataframe?

m is a map as following:

scala> m
res119: scala.collection.mutable.Map[Any,Any] = Map(A -> 0.11164610291904906, B -> 0.11856755943424617, C -> 0.1023171832681312)

I want to get:

name  score
A  0.11164610291904906
B  0.11856755943424617
C  0.1023171832681312

How to get the final dataframe?

Asked Sep 20 '17 by Muz


1 Answer

First convert it to a Seq, then you can use the toDF() function.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.getOrCreate()
import spark.implicits._  // needed for the toDF() conversion

val m = Map("A" -> 0.11164610291904906, "B" -> 0.11856755943424617, "C" -> 0.1023171832681312)

// Convert the map's (key, value) pairs to a Seq of tuples, then to a DataFrame
val df = m.toSeq.toDF("name", "score")
df.show

Will give you:

+----+-------------------+
|name|              score|
+----+-------------------+
|   A|0.11164610291904906|
|   B|0.11856755943424617|
|   C| 0.1023171832681312|
+----+-------------------+
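Note that if your map is typed as Map[Any,Any], as in the question's REPL output, calling toDF() directly will not compile because Spark has no encoder for (Any, Any). A minimal sketch of one way around this, assuming the keys are really strings and the values are numeric, is to cast each pair to concrete types first (this reuses the spark.implicits._ import from above):

import scala.collection.mutable

// Assumed example: an untyped mutable map like the one in the question
val anyMap: mutable.Map[Any, Any] =
  mutable.Map("A" -> 0.11164610291904906, "B" -> 0.11856755943424617, "C" -> 0.1023171832681312)

// Cast each pair to (String, Double) so spark.implicits._ can provide an encoder
val typed = anyMap.toSeq.map { case (k, v) => (k.toString, v.toString.toDouble) }
val df2 = typed.toDF("name", "score")
df2.show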
Answered Nov 11 '22 by Shaido