I'm starting a project using Spark, developed with Scala and the IntelliJ IDE. I was wondering how to set `--properties-file` with a specific Spark configuration in an IntelliJ run configuration.
In the application I read configuration values like this: `sc.getConf.get("param1")`.
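For context, here is a minimal sketch of how that lookup sits in the application (the object name and the optional second key are placeholders, not part of the real project):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    // SparkConf picks up whatever spark-submit (or spark.* JVM system properties) supplied
    val sc = new SparkContext(new SparkConf().setAppName("main"))

    // Throws NoSuchElementException if the key was never provided
    val param1 = sc.getConf.get("param1")
    // Hypothetical second key, looked up with a fallback instead
    val param2 = sc.getConf.getOption("param2").getOrElse("default")

    println(s"param1=$param1 param2=$param2")
    sc.stop()
  }
}
```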
When I execute the Spark job from the command line, it works like a charm:
/opt/spark/bin/spark-submit --class "com.class.main" --master local --properties-file properties.conf ./target/scala-2.11/main.jar arg1 arg2 arg3 arg4
The problem is when I execute the job from an IntelliJ Run Configuration using VM Options: I can pass the `--master` param as `-Dspark.master=local` and `--conf` params as `-Dspark.param1=value1`, but I can't find an equivalent for `--properties-file`.
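For reference, this is roughly how those `-D` options reach the application (a sketch; the object name is a placeholder): the no-argument `SparkConf` constructor copies every JVM system property whose name starts with `spark.` into the configuration, but as far as I can tell there is no `-D` switch that makes it read an entire properties file.

```scala
import org.apache.spark.SparkConf

object VmOptionsCheck {
  def main(args: Array[String]): Unit = {
    // new SparkConf() (loadDefaults = true) copies every JVM system property
    // whose name starts with "spark." into the configuration, which is why
    // -Dspark.master=local and -Dspark.param1=value1 are visible here.
    val conf = new SparkConf()
    println(conf.getOption("spark.master")) // Some(local) when set via VM Options
    println(conf.getOption("spark.param1")) // Some(value1) when set via VM Options
  }
}
```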
Can anyone point me at the right way to set this up?
I don't think it's possible to use `--properties-file` to launch a Spark application from within IntelliJ IDEA. `spark-submit` is the shell script that submits a Spark application for execution, and it does a few extra things to create a proper submission environment before the application runs.
You can, however, mimic the behaviour of `--properties-file` by leveraging `conf/spark-defaults.conf`, which a Spark application loads by default. You could create a `conf/spark-defaults.conf` under `src/test/resources` (or `src/main/resources`) with the content of `properties.conf`. That is supposed to work.
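For example, assuming `properties.conf` carries the settings shown in the question, `src/main/resources/conf/spark-defaults.conf` might look like this (the keys are placeholders mirroring whatever `properties.conf` contains):

```
# Same whitespace-separated key/value format that spark-submit expects
spark.master    local
spark.param1    value1
```

If that file is picked up, the `-Dspark.master=local` VM option should no longer be needed.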