I am starting to learn Spark SQL, and I am using the following dependencies in sbt:
name := "sparkLearning"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "1.6.1"
val sqlVersion = "1.3.1"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" % "spark-sql" % sqlVersion
)
I am getting the following error:
Error: Error while importing SBT project:
...
[info] Resolving com.thoughtworks.paranamer#paranamer;2.6 ...
[info] Resolving org.scala-sbt#completion;0.13.15 ...
[info] Resolving org.scala-sbt#control;0.13.15 ...
[info] Resolving org.scala-sbt#sbt;0.13.15 ...
[info] Resolving org.scala-sbt#run;0.13.15 ...
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-48dd0744422128446aee9ac31aa356ee203cc9f4 ...
[info] Resolving org.scala-sbt#test-interface;1.0 ...
[info] Resolving com.jcraft#jsch;0.1.50 ...
[info] Resolving org.scala-lang#scala-compiler;2.10.6 ...
[info] Resolving jline#jline;2.14.3 ...
[info] Resolving org.scala-sbt#compiler-ivy-integration;0.13.15 ...
[info] Resolving org.scala-sbt#incremental-compiler;0.13.15 ...
[info] Resolving org.scala-sbt#logic;0.13.15 ...
[info] Resolving org.scala-sbt#main-settings;0.13.15 ...
[trace] Stack trace suppressed: run 'last *:update' for the full output.
[trace] Stack trace suppressed: run 'last *:ssExtractDependencies' for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-sql;1.3.1: not found
[error] (*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-sql;1.3.1: not found
[error] Total time: 15 s, completed 27-Jul-2017 15:29:52
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384M; support was removed in 8.0
Please let me know how to resolve this.
SBT is the most common build tool for Scala projects. As a project grows more complex, its number of dependencies increases, and each dependency brings in further dependencies of its own, called transitive dependencies. Eventually a project can suffer from JAR dependency hell.
SBT is an interactive build tool used to run tests and package your project as a JAR file, so it can be run in a cloud cluster computing environment (like Databricks).
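If two of your dependencies pull in conflicting versions of the same transitive dependency, sbt lets you exclude the offender explicitly. A hypothetical sketch (the excluded module here is only an example, not something this build requires):

libraryDependencies += ("org.apache.spark" %% "spark-core" % sparkVersion)
  .exclude("commons-logging", "commons-logging")  // drop a conflicting transitive dependency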
The correct form for your sbt file is:
name := "sparkLearning"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "1.6.1"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.10" % sparkVersion,
"org.apache.spark" % "spark-sql_2.10" % sparkVersion
)
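For context, %% tells sbt to append the project's Scala binary version to the artifact name. The original build failed because "org.apache.spark" % "spark-sql" % "1.3.1" (with a single %) asked for an unsuffixed spark-sql artifact, which does not exist; only suffixed artifacts like spark-sql_2.10 and spark-sql_2.11 are published. With scalaVersion := "2.11.8", the two declarations below resolve to the same artifact:

libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % sparkVersion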
I would suggest you use a more recent Spark version, which is compatible with Scala 2.11.8:
name := "sparkLearning"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "2.2.0"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion
)
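Once the build resolves, a minimal Spark SQL program to sanity-check the setup could look like the sketch below. It assumes a local master and uses hypothetical in-memory data, just to exercise the SQL API:

import org.apache.spark.sql.SparkSession

object SparkSqlExample {
  def main(args: Array[String]): Unit = {
    // Local master for quick experimentation; point this at a real cluster later.
    val spark = SparkSession.builder()
      .appName("sparkLearning")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._

    // Hypothetical in-memory data for illustration only.
    val people = Seq(("Alice", 30), ("Bob", 25)).toDF("name", "age")
    people.createOrReplaceTempView("people")

    spark.sql("SELECT name FROM people WHERE age > 26").show()

    spark.stop()
  }
}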