
value toDS is not a member of org.apache.spark.rdd.RDD

I am trying to write a sample Apache Spark program that converts an RDD to a Dataset, but in the process I get a compile-time error.

Here is my sample code and error:

code:

import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.SparkContext
import org.apache.spark.sql.Dataset

object Hello {

  case class Person(name: String, age: Int)

  def main(args: Array[String]){
    val conf = new SparkConf()
      .setAppName("first example")
      .setMaster("local")
    val sc = new SparkContext(conf)
    val peopleRDD: RDD[Person] = sc.parallelize(Seq(Person("John", 27)))
    val people = peopleRDD.toDS
  }
}

and my error is:

value toDS is not a member of org.apache.spark.rdd.RDD[Person]

I have added the Spark Core and Spark SQL jars.

and my versions are:

Spark 1.6.2

Scala 2.10
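For reference, pulling in those two jars through a build tool keeps the Spark and Scala versions consistent. A minimal build.sbt sketch for the versions above might look like this (coordinates assumed from the standard Maven group IDs; verify against your repository):

```scala
// build.sbt (sketch): Scala 2.10 with Spark 1.6.2 Core + SQL
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.2",
  "org.apache.spark" %% "spark-sql"  % "1.6.2"
)
```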

user6325753 asked Jun 16 '17
2 Answers

There are two mistakes I can see in your code.

First, you have to import sqlContext.implicits._, as toDS and toDF are defined in the implicits of SQLContext.

Second, the case class should be defined outside the scope of the class where it is used; otherwise a Task not serializable exception will occur.
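To see why the compiler says toDS "is not a member" of RDD: the method is not defined on RDD at all; it is added by an implicit conversion that only becomes visible after the import. Here is a plain-Scala analogue of that mechanism (the names FakeRDD and DatasetOps are illustrative, not Spark's actual classes):

```scala
object ImplicitsDemo {
  // Stand-in for RDD: note it has no toDS method of its own
  class FakeRDD[T](val data: Seq[T])

  object Implicits {
    // Wrapper that "adds" a toDS-like method to FakeRDD,
    // much like sqlContext.implicits._ adds toDS to RDD
    implicit class DatasetOps[T](rdd: FakeRDD[T]) {
      def toDS: List[T] = rdd.data.toList
    }
  }

  def main(args: Array[String]): Unit = {
    val rdd = new FakeRDD(Seq(1, 2, 3))
    // Without this import, rdd.toDS would not compile
    import Implicits._
    println(rdd.toDS)
  }
}
```

Without `import Implicits._`, the call `rdd.toDS` fails with exactly the same kind of "value toDS is not a member of ..." error.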

The complete solution is as follows:

    import org.apache.spark.SparkConf
    import org.apache.spark.rdd.RDD
    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext

    object Hello {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("first example")
          .setMaster("local")
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc)

        import sqlContext.implicits._
        val peopleRDD: RDD[Person] = sc.parallelize(Seq(Person("John", 27)))
        val people = peopleRDD.toDS
        people.show(false)
      }
    }

    case class Person(name: String, age: Int)
Ramesh Maharjan answered Nov 11 '22


Spark version < 2.x

toDS is available with sqlContext.implicits._

val sqlContext = new SQLContext(sc)
import sqlContext.implicits._
val people = peopleRDD.toDS()

Spark version >= 2.x

val spark: SparkSession = SparkSession.builder
  .config(conf)
  .getOrCreate()

import spark.implicits._
val people = peopleRDD.toDS()

Hope it helps.

mrsrinivas answered Nov 11 '22