When I create the jar of my Spark application and try to run it using spark-submit, I get the following error. This is the command I used:
spark-submit --executor-memory 1g --jars s3://test-data-lab-users/spachari/test/test_2.10-1.0.jar
This is the error I am getting. Does this mean I have not passed the correct parameters to spark-submit?
Exception in thread "main" java.lang.IllegalArgumentException: Missing application resource.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:241)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitArgs(SparkSubmitCommandBuilder.java:160)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitCommand(SparkSubmitCommandBuilder.java:276)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:151)
at org.apache.spark.launcher.Main.main(Main.java:86)
Command exiting with ret '1'
Using --deploy-mode, you specify where the driver program of your Spark application runs. Spark supports the cluster and client deploy modes. In cluster mode, the driver runs on one of the worker nodes, and that node shows as a driver on the Spark web UI of your application. Cluster mode is typically used to run production jobs.
Your spark-submit syntax can be:
spark-submit --class main-class application-jar [application-arguments]
--class main-class is the fully qualified name of the class that contains the main method for a Java or Scala application. For SparkPi, the main class would be org.
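Putting that together, a corrected version of the command from the question might look like the sketch below. The --class value is an assumption — substitute the fully qualified name of your own application's main class:

```shell
# Hypothetical main class name (com.example.TestApp) -- replace it with
# your application's actual entry point. The application jar is passed as
# a positional argument, not via --jars.
spark-submit \
  --class com.example.TestApp \
  --deploy-mode cluster \
  --executor-memory 1g \
  s3://test-data-lab-users/spachari/test/test_2.10-1.0.jar
```

The key difference from the failing command is that the jar appears as the positional "application resource" argument rather than as a value of --jars.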
tl;dr Remove the --jars option and start over.
java.lang.IllegalArgumentException: Missing application resource.
You missed your...well...Spark application that the message refers to as "application resource".
That's more obvious when you execute spark-submit and see the different command-line options and their meanings.
./bin/spark-submit
Usage: spark-submit [options] <app jar | python file | R file> [app arguments]
The part <app jar | python file | R file> is what you missed.
To reproduce your issue, you can simply execute spark-submit with the --jars option without specifying the main jar or class of a Spark application.
$ ./bin/spark-submit --jars target/spark-parent_2.11-2.3.0-SNAPSHOT-tests.jar
Exception in thread "main" java.lang.IllegalArgumentException: Missing application resource.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:241)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitArgs(SparkSubmitCommandBuilder.java:160)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitCommand(SparkSubmitCommandBuilder.java:274)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:151)
at org.apache.spark.launcher.Main.main(Main.java:86)
Quoting spark-submit --help, --jars is...
--jars JARS Comma-separated list of jars to include on the driver and executor classpaths.
--jars can be very helpful when a Spark application depends on additional jar files (aka dependencies), e.g. mysql-connect.jar, that you cannot (or most likely don't want to) assemble into your uber jar.
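If your application does need extra jars, --jars goes alongside the application jar, not in place of it. A sketch, where both the dependency jar path and the main class name are assumptions for illustration:

```shell
# --jars supplies extra dependency jars for the driver and executor
# classpaths; the application jar is still required as the last
# positional argument. Paths and class name below are hypothetical.
spark-submit \
  --class com.example.TestApp \
  --jars /path/to/mysql-connect.jar \
  s3://test-data-lab-users/spachari/test/test_2.10-1.0.jar
```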