 

sbt - object apache is not a member of package org

I want to deploy and submit a Spark program using sbt, but it's throwing an error.

Code:

package in.goai.spark

import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache
    val c = lines.count
    println(s"There are $c lines in $fileName")
  }
}

build.sbt

    name := "First Spark"

    version := "1.0"

    organization := "in.goai"

    scalaVersion := "2.11.8"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

    resolvers += Resolver.mavenLocal

Under the first/project directory:

build.properties

sbt.version=0.13.9

When I try to run sbt package, it throws the error given below.

[root@hadoop first]# sbt package
[info] Loading project definition from /home/training/workspace_spark/first/project
[info] Set current project to First Spark (in build file:/home/training/workspace_spark/first/)
[info] Compiling 1 Scala source to /home/training/workspace_spark/first/target/scala-2.11/classes...
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:3: object apache is not a member of package org
[error] import org.apache.spark.{SparkContext, SparkConf}
[error]            ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:9: not found: type SparkConf
[error]      val conf = new SparkConf().setAppName("First Spark")
[error]                     ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:11: not found: type SparkContext
[error]      val sc = new SparkContext(conf)
[error]                   ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 4 s, completed May 10, 2018 4:05:10 PM

I have tried with extends App too, but there is no change.

Bhaskar Das asked Oct 17 '22

1 Answer

Please remove resolvers += Resolver.mavenLocal from build.sbt. Since spark-core is available on Maven Central, we don't need to use the local Maven resolver.
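For reference, the resulting build.sbt would look roughly like the sketch below, keeping the same names and versions from the question and only dropping the resolver line:

    name := "First Spark"

    version := "1.0"

    organization := "in.goai"

    scalaVersion := "2.11.8"

    // spark-core is resolved from Maven Central, so no extra resolver is needed
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"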

After that, you can try sbt clean package.
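Once the build succeeds, the jar can be submitted to Spark. The jar name below is an assumption based on the name and version in build.sbt (sbt normally lowercases the name and replaces spaces with dashes), and the input path is just a placeholder, so adjust both to what sbt actually produces under target/scala-2.11:

    spark-submit --class in.goai.spark.SparkMeApp --master local[*] target/scala-2.11/first-spark_2.11-1.0.jar /path/to/input.txt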

himanshuIIITian answered Oct 20 '22