How to print elements of particular RDD partition in Spark?

How to print the elements of a particular partition, say 5th, alone?

val distData = sc.parallelize(1 to 50, 10)
Arnav asked May 06 '15
2 Answers

Assuming you are doing this only for testing or debugging, use glom(), which turns each partition into a list. See the Spark documentation: https://spark.apache.org/docs/1.6.0/api/python/pyspark.html#pyspark.RDD.glom. Note that glom().collect() brings every partition back to the driver, so use it only on small data.

>>> rdd = sc.parallelize([1, 2, 3, 4], 2)
>>> rdd.glom().collect()
[[1, 2], [3, 4]]
>>> rdd.glom().collect()[1]
[3, 4]

Edit: the same in Scala (partition indices are 0-based, so the 5th partition is index 4):

scala> val distData = sc.parallelize(1 to 50, 10)
scala> distData.glom().collect()(4)
res2: Array[Int] = Array(21, 22, 23, 24, 25)
Dichen answered Oct 23 '22


Using Spark/Scala:

val data = 1 to 50
val distData = sc.parallelize(data, 10)

// Print the elements of partition index 5. println runs on the
// executors, so in cluster mode the output appears in the executor
// logs; in local mode it prints to the driver console.
distData.mapPartitionsWithIndex { (index: Int, it: Iterator[Int]) =>
  if (index == 5) it.foreach(println)
  Iterator.empty[Int]
}.collect

produces:

26
27
28
29
30
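The output (26 through 30) follows from how parallelize slices a range into equal chunks: 50 elements over 10 partitions gives 5 elements per partition, so partition index 5 holds 26..30 (and index 4 holds 21..25, matching the glom example above). A minimal pure-Python sketch of that even slicing, assuming Spark's equal-split behavior for sequences (`slice_range` is a hypothetical helper, not a Spark API):

```python
def slice_range(data, num_slices):
    """Split data into num_slices contiguous chunks, mimicking how
    Spark slices a collection passed to parallelize()."""
    n = len(data)
    return [data[i * n // num_slices:(i + 1) * n // num_slices]
            for i in range(num_slices)]

partitions = slice_range(list(range(1, 51)), 10)
print(partitions[5])  # [26, 27, 28, 29, 30]
```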
Fabio Fantoni answered Oct 23 '22