I have the following test class:
import org.apache.spark.SparkContext
import org.scalatest.{ConfigMap, BeforeAndAfterAll, FunSuite}

class MyTrainingSuiteIT extends FunSuite with BeforeAndAfterAll {

  private[this] var _sc: SparkContext = null
  private[this] val defaultCoresNumber = 1
  private[this] val defaultMaster = s"local[$defaultCoresNumber]"
  private[this] val defaultName = "some-spark-integration-test"

  override def beforeAll(configMap: ConfigMap): Unit = {
    super.beforeAll()
    val mode = configMap.get("mode").get
    mode match {
      case "local" =>
        val coresNumber = configMap.get("cores").get
        _sc = new SparkContext(s"local[$coresNumber]", defaultName)
      case "docker" =>
        println("Docker was chosen")
        _sc = new SparkContext(defaultMaster, defaultName)
      case "cluster" =>
        val clusterType = configMap.get("clusterType").get
        println(s"Cluster of type [$clusterType] was chosen.")
        _sc = new SparkContext(defaultMaster, defaultName)
      case _ =>
        println("Unknown mode was chosen")
        _sc = new SparkContext(defaultMaster, defaultName)
    }
  }

  override def afterAll(): Unit = {
    _sc.stop()
    _sc = null
    super.afterAll()
  }

  test("Context testing") {
    assert(defaultMaster == s"local[$defaultCoresNumber]")
  }

  test("Fail test") {
    assert(3 === 2)
  }
}
First, I compile it in IntelliJ IDEA, and then I try to run it with the following command in the terminal:
scala -classpath /home/Downloads/scalatest_2.10.jar org.scalatest.tools.Runner -R /home/hspark/datasource-tests.jar -s package.name.MyTrainingSuiteIT -Dmode=local -Dcores=2
After I do that, ScalaTest's GUI window opens and I get this message:
Event: Run Aborted
Message: A needed class was not found. This could be due to an error in your runpath. Missing class: org/apache/spark/SparkContext
Summary: Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
Exception: java.lang.NoClassDefFoundError
How can I solve this issue?
This is a working version of the scala command:
scala -classpath /home/Downloads/scalatest_2.10.jar:/home/spark/core-1.2.19.jar org.scalatest.tools.Runner -R /home/hspark/datasource-tests.jar -s package.name.MyTrainingSuiteIT -Dmode=local -Dcores=2
The source of the exception was the missing Spark library on the classpath: the original command put only the ScalaTest jar there, so org/apache/spark/SparkContext could not be resolved at runtime. Adding the Spark core jar (classpath entries are separated by : on Linux) fixes it. As @Ben suggested, a build tool such as sbt would make it easier to run your tests.
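For reference, a minimal sbt setup for the same suite might look like the sketch below. The version numbers are assumptions based on the scalatest_2.10 and Spark 1.x jars in your commands, so adjust them to whatever your project actually uses:

// build.sbt -- minimal sketch; the version numbers here are assumptions
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  // Spark core, needed at test runtime for SparkContext
  "org.apache.spark" %% "spark-core" % "1.2.1" % "test",
  // ScalaTest, provides FunSuite, BeforeAndAfterAll and the runner
  "org.scalatest"    %% "scalatest"  % "2.2.6" % "test"
)

With that in place, sbt assembles the test classpath for you, and the config map entries can still be passed after -- :

sbt "testOnly package.name.MyTrainingSuiteIT -- -Dmode=local -Dcores=2"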