
value read is not a member of org.apache.spark.SparkContext

I'm using Scala 2.11.8, JDK 1.8, and Spark 2.0.2.

I'm trying to run the LDA model example from the official Apache Spark site, and I get an error on the following line:

val dataset = spark.read.format("libsvm")
  .load("data/libsvm_data.txt")

The error message is:

Error:(49, 25) value read is not a member of org.apache.spark.SparkContext
    val dataset = spark.read.format("libsvm")
                        ^


I don't know how to solve it.

Asked Dec 04 '16 by Jennie.WU


2 Answers

Looks like you're trying to call read on a SparkContext, instead of an SQLContext or a SparkSession:

import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SparkSession}

// New 2.0.+ API: create a SparkSession and use it for all purposes:
val session = SparkSession.builder().appName("test").master("local").getOrCreate()
session.read.load("/file") // OK

// Old <= 1.6.* API: create a SparkContext, then a SQLContext for the DataFrame API:
val sc = new SparkContext("local", "test") // used for RDD operations only
val sqlContext = new SQLContext(sc)        // used for DataFrame / Dataset APIs

sqlContext.read.load("/file") // OK
sc.read.load("/file")         // NOT OK - read is not a member of SparkContext
Answered Nov 09 '22 by Tzach Zohar


Add these lines:

import org.apache.spark.sql.SparkSession

val session = SparkSession.builder().appName("app_name").master("local").getOrCreate()

val training = session.read.format("format_name").load("path_to_file")
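For completeness, once the data is loaded through a `SparkSession` rather than a `SparkContext`, the rest of the official LDA example should work. Here is a minimal sketch assuming Spark 2.x; the `appName`, `setK`, and `setMaxIter` values are illustrative, and the data path is the one from the question:

```scala
import org.apache.spark.ml.clustering.LDA
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("LDAExample")
  .master("local[*]")
  .getOrCreate()

// spark is a SparkSession here, so .read is available
// (calling .read on a SparkContext is what caused the original error)
val dataset = spark.read.format("libsvm").load("data/libsvm_data.txt")

// Fit an LDA model (k topics, illustrative settings)
val lda = new LDA().setK(10).setMaxIter(10)
val model = lda.fit(dataset)

// Inspect the top terms per topic
model.describeTopics(3).show(false)

spark.stop()
```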
Answered Nov 09 '22 by user3503711