I created a jar package from a project with this file tree:
build.sbt
src/main
src/main/scala
src/main/scala/Tester.scala
src/main/scala/main.scala
where Tester is a class with a function (named print()) and main has an object that runs it and prints "Hi!" (taken from the Spark documentation). I built the jar file with sbt successfully and it worked well with spark-submit.
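The two files look roughly like this (a simplified sketch, not the exact code):

// src/main/scala/Tester.scala
class Tester {
  def print(): Unit = println("Hi!")
}

// src/main/scala/main.scala
object main {
  def main(args: Array[String]): Unit = {
    val t = new Tester()
    t.print()
  }
}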
Now I want to add it to spark-shell and use the Tester class to create objects and so on. I added the jar file to spark-defaults.conf, but:
scala> val t = new Tester();
<console>:23: error: not found: type Tester
       val t = new Tester();
Just use the --jars parameter. Spark will share those jars (comma-separated) with the executors.
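For example, a minimal sketch of that for the jar in the question (the path is a placeholder, not the real location of your jar):

./spark-shell --jars /path/to/tester.jar

scala> val t = new Tester()
scala> t.print()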
Use the --jars option
To add JARs to a Spark job, the --jars option can be used to include JARs on the Spark driver and executor classpaths. If multiple JAR files need to be included, use commas to separate them. The following is an example:
spark-submit --jars /path/to/jar/file1,/path/to/jar/file2 ...
Basics
Spark's shell provides a simple way to learn the API, as well as a powerful tool to analyze data interactively. It is available in either Scala (which runs on the Java VM and is thus a good way to use existing Java libraries) or Python.
You can try providing the jars as an argument as below:
./spark-shell --jars pathOfJarsWithCommaSeparated
Or you can add the following configuration to your spark-defaults.conf, but remember to remove the .template suffix from the end of spark-defaults.conf.template:
spark.driver.extraClassPath pathOfJars
Note that extraClassPath entries use the platform classpath separator (a colon on Linux/macOS) rather than commas.
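A sketch of what those entries could look like in conf/spark-defaults.conf (the paths are placeholders); if the class is also needed on the executors, the matching spark.executor.extraClassPath entry can be set as well:

spark.driver.extraClassPath   /path/to/file1.jar:/path/to/file2.jar
spark.executor.extraClassPath /path/to/file1.jar:/path/to/file2.jar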
If you want to add a .jar to the classpath after you've entered spark-shell, use :require. Like:
scala> :require /path/to/file.jar
Added '/path/to/file.jar' to classpath.
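Once the jar has been added this way, the class from the question should be usable in the same session, for example (assuming Tester is in the default package):

scala> val t = new Tester()
scala> t.print()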