How to iterate over a Scala WrappedArray? (Spark)

I perform the following operations:

val tempDict = sqlContext.sql("""select words.pName_token, collect_set(words.pID) as docids
                                 from words
                                 group by words.pName_token""").toDF()

val wordDocs = tempDict.filter(tempDict("pName_token") === word)

val listDocs = wordDocs.map(t => t(1)).collect()

listDocs: Array[Any] = Array(WrappedArray(123, 234, 205876618, 456))

My question is how do I iterate over this wrapped array or convert this into a list?

The options I get for the listDocs are apply, asInstanceOf, clone, isInstanceOf, length, toString, and update.

How do I proceed?

asked Jul 08 '16 by boY

1 Answer

Here is one way to solve this.

import org.apache.spark.sql.Row
import org.apache.spark.sql.functions._
import scala.collection.mutable.WrappedArray

val data = Seq((Seq(1,2,3),Seq(4,5,6),Seq(7,8,9)))
val df = sqlContext.createDataFrame(data)
val first = df.first

// use getAs to extract the column with its expected element type
val mapped = first.getAs[WrappedArray[Int]](0)

// now it can be used like a normal Scala collection
mapped.mkString("\n")
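
// WrappedArray is a Seq, so (assuming the mapped value from above) the usual
// collection operations apply: iterate it directly or convert it to a List
mapped.foreach(println)
val asList = mapped.toList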

// pattern match on Row to extract each array column as a Seq
val rows = df.collect.map {
    case Row(a: Seq[Any], b: Seq[Any], c: Seq[Any]) => 
        (a, b, c)
}
rows.mkString("\n")
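
Applied to the original question, a minimal sketch (assuming the tempDict DataFrame and word variable from the question, and that pID is an integer column; use Long or String in getAs if the schema differs):

import scala.collection.mutable.WrappedArray

val wordDocs = tempDict.filter(tempDict("pName_token") === word)

// each row's docids column (built by collect_set) comes back as a WrappedArray
val docIds: List[Int] = wordDocs
  .collect()
  .flatMap(row => row.getAs[WrappedArray[Int]]("docids"))
  .toList

docIds.foreach(println)

getAs keeps the extraction explicit per column, while the Row pattern match above is handy when several array columns need to be pulled out of each row at once.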
answered Sep 20 '22 by Rockie Yang