
java.lang.ClassNotFoundException when I use "spark-submit" with a new class name rather than "SimpleApp"

I wrote a Spark program in Scala, but when I use "spark-submit" to submit my project, I get a java.lang.ClassNotFoundException.

My .sbt file:

name := "Spark Project"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"

My .scala file is named SparkProject.scala, and the object inside it is also named SparkProject.

/* SparkProject.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SparkProject {
  def main(args: Array[String]) {
    val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

My command to submit the project is:

spark-submit --class "SparkProject" --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar
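
To double-check which class names actually end up in the jar (a quick diagnostic, using the same jar path as in the command above), you can list its contents:

jar tf target/scala-2.10/spark-project_2.10-1.0.jar | grep SparkProject

If the class shows up under a directory (for example com/example/SparkProject.class, if a package were declared), then spark-submit needs that fully qualified name rather than the bare "SparkProject".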

Does anyone know how to solve this? What confuses me most is that when I try the example provided here [http://spark.apache.org/docs/latest/quick-start.html], it runs well. But when I build a new project and submit it, it goes wrong. Any help will be greatly appreciated.

asked Apr 08 '15 by zhang


2 Answers

Adding a package name worked for me.

My code is simple too:

package spark.wordcount

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]) {
    val infile = "/input" // Should be some file on your system
    val conf = new SparkConf().setAppName("word count")
    val sc = new SparkContext(conf)
    val indata = sc.textFile(infile, 2).cache()
    val words = indata.flatMap(line => line.split(" ")).map(word => (word,1)).reduceByKey((a,b) => (a+b))
    words.saveAsTextFile("/output")
    println("All words are counted!")
  }
}

I ran spark-submit like this, and it completed successfully:

[root@sparkmaster bin]# ./spark-submit --class spark.wordcount.WordCount /opt/spark-wordcount-in-scala.jar
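
For reference, a minimal build.sbt that would package this example (a sketch, assuming the same Spark and Scala versions as in the question):

name := "Spark WordCount"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"

After sbt package, the WordCount class sits under spark/wordcount/ inside the jar, which is why spark-submit has to be given the fully qualified name spark.wordcount.WordCount.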

answered Oct 12 '22 by Ben Liao


Removing the package name works for me.
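
Concretely, for the project in the question (a minimal sketch, assuming the original layout): drop any package line so the object compiles into the default package at the root of the jar, and then the bare name passed to --class matches:

// SparkProject.scala -- no `package ...` line, so the class lands at the jar root
object SparkProject {
  def main(args: Array[String]): Unit = {
    // same Spark code as in the question
  }
}

spark-submit --class "SparkProject" --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar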

answered Oct 12 '22 by Vibhuti