In my project the external library is spark-assembly-1.3.1-hadoop2.6.0. When I press '.', the IDE suggests toDF(), but when I actually write it in the code it reports "cannot resolve symbol toDF()". Sorry, I can't find toDF() in the Apache Spark documentation either.
import org.bson.BasicBSONObject

// time and period get defaults here so the two-argument Feature(...) call below compiles
case class Feature(name: String, value: Double, time: String = "", period: String = "")

// assuming each element of RDD is a tuple whose first two fields are the name and the value
val RESRDD = RDD.map { tuple =>
  val bson = new BasicBSONObject()
  bson.put("name", tuple._1)
  bson.put("value", tuple._2)
  (null, bson)
}

RESRDD
  .map(_._2)
  .map(f => Feature(f.get("name").toString, f.get("value").toString.toDouble))
  .toDF()
To be able to use toDF you have to import sqlContext.implicits._ first. toDF is not defined on RDD itself; it is added through an implicit conversion that this import brings into scope:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
case class Foobar(foo: String, bar: Integer)
val foobarRdd = sc.parallelize(("foo", 1) :: ("bar", 2) :: ("baz", -1) :: Nil)
  .map { case (foo, bar) => Foobar(foo, bar) }
val foobarDf = foobarRdd.toDF
foobarDf.limit(1).show
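With the same import in scope, the pipeline from the question resolves toDF as well. A minimal sketch, assuming RESRDD and the Feature case class are defined as in the question:

// the sqlContext.implicits._ import above is what makes toDF resolve here
val featureDf = RESRDD
  .map(_._2)
  .map(f => Feature(f.get("name").toString, f.get("value").toString.toDouble))
  .toDF()

featureDf.show()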