I installed Spark to C:\Spark1_6\spark-1.6.0-bin-hadoop2.6. After navigating to my project's path, I enter the sbt assembly command and get the following error message:
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':'
[error] Not a valid key: assembly
[error] assembly
[error] ^
Here is my sbt project structure:
-Project101
 -project
  -build.properties
  -plugins.sbt
 -src
 -build.sbt
Here is my build.sbt:
name := "Project101"
version := "1.0"
scalaVersion := "2.10.2"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.6.0" exclude ("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" % "spark-sql_2.10" % "1.6.0" exclude ("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" %% "spark-hive" % "1.6.0",
  "org.apache.spark" %% "spark-streaming" % "1.6.0",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0"
)
resolvers in Global ++= Seq(
  "Sbt plugins" at "https://dl.bintray.com/sbt/sbt-plugin-releases",
  "Maven Central Server" at "http://repo1.maven.org/maven2",
  "TypeSafe Repository Releases" at "http://repo.typesafe.com/typesafe/releases/",
  "TypeSafe Repository Snapshots" at "http://repo.typesafe.com/typesafe/snapshots/"
)
Here is the plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")
The sbt package command works and creates the jar file. But I also need to run sbt assembly, and that is not working:
Not a valid command: assembly
Whenever you face this error message, make sure you are in the top-level directory of a project that has the sbt-assembly plugin installed.
If your project is in the Project101 directory, make sure that project/plugins.sbt contains the line:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")
With that in place, go back to the Project101 directory and execute sbt assembly. That should run the plugin and create an uber-jar.
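Even once the plugin is picked up, Spark builds frequently fail during sbt assembly with "deduplicate: different file contents found" errors, because Spark's transitive dependencies ship overlapping files. If that happens, a merge strategy in build.sbt can resolve it. This is a minimal sketch, assuming the auto-imported keys of sbt-assembly 0.12.x (the version in your plugins.sbt); adjust the cases to the conflicts your build actually reports:

```scala
// Sketch only: add to build.sbt if `sbt assembly` fails with
// "deduplicate: different file contents found" errors.
assemblyMergeStrategy in assembly := {
  // Drop duplicated metadata files coming from overlapping jars
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  // Fall back to the plugin's default strategy for everything else
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```

As a side effect, this setting is also a quick sanity check: if plugins.sbt is not being loaded, build.sbt will fail to compile because assemblyMergeStrategy is not on the classpath.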