I am following the instructions here: https://spark.apache.org/docs/latest/quick-start.html to create a simple application that will run on a local standalone Spark build.
On my system I have Scala 2.9.2 and sbt 0.13.7.
When I put the following in my simple.sbt:

scalaVersion := "2.9.2"

and then run sbt package, I get the error:
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.9.2;1.3.1: not found
However, when I write in simple.sbt:

scalaVersion := "2.10.4"
sbt runs successfully and the application runs fine on Spark.
How can this happen, since I do not have Scala 2.10.4 installed on my system?
Scala is not a package; it is a library that runs on top of the Java runtime. Likewise, the Scala compiler scalac runs on top of a Java runtime. The fact that you have a version of Scala installed on your "system" is a convenience, but it is not in any way required.
Therefore, it is entirely possible to launch sbt with one version of Scala (2.9.2) but have it run other commands, such as compilation, using an entirely different version of Scala (2.10.x), by passing the appropriate flags (such as -classpath).
See: Can java run a compiled scala code?
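To make this concrete, here is a minimal sketch of the relevant sbt setting (nothing beyond what the quick-start guide already uses): even if the only scalac installed on the machine is 2.9.2, sbt downloads the 2.10.4 compiler and standard library from its resolvers and compiles the project with those.

// In simple.sbt: sbt fetches the Scala 2.10.4 compiler and scala-library
// from its resolvers and uses them for compilation; the Scala 2.9.2
// installed on the system is never consulted.
scalaVersion := "2.10.4"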
As @noahlz said, you don't need Scala on your system, as sbt will fetch it for you.
The issue you're having is that there is no spark-core 1.3.1 build for Scala 2.9.2.
From what I can see in Maven Central (searching for spark-core), there are only builds of spark-core for Scala 2.10 and 2.11.
Therefore I would recommend you use this setup:
scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
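Note that the %% operator tells sbt to append the Scala binary version to the artifact name, so with scalaVersion := "2.11.6" the line above resolves to spark-core_2.11 (with 2.9.2 it looked for the non-existent spark-core_2.9.2, hence your error). As an illustration, the same dependency can be written with the suffix spelled out explicitly:

// equivalent to the %% form above, with the Scala suffix written out by hand
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.3.1"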
If for whatever reason that doesn't work for you, use Scala 2.10.5:
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
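For completeness, a full simple.sbt along the lines of the quick-start guide might look like the following sketch (the project name and version are placeholders; it uses the 2.10.5 setup from above):

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

Running sbt package in the project root should then produce a JAR under target/scala-2.10/ that you can pass to spark-submit, as described in the quick-start guide.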