
How to use spark-submit's --properties-file option to launch Spark application in IntelliJ IDEA?

I'm starting a Spark project developed in Scala with IntelliJ IDEA.

I was wondering how to set --properties-file with a specific Spark configuration in an IntelliJ run configuration.

I read configuration values like this: sc.getConf.get("param1")

When I execute the Spark job from the command line, it works like a charm:

/opt/spark/bin/spark-submit --class "com.class.main" --master local --properties-file properties.conf ./target/scala-2.11/main.jar arg1 arg2 arg3 arg4

The problem arises when I execute the job from an IntelliJ Run Configuration using VM Options:

  1. I succeeded with --master by passing -Dspark.master=local
  2. I succeeded with --conf params by passing e.g. -Dspark.param1=value1
  3. I failed to find an equivalent for --properties-file
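Concretely, combining attempts 1 and 2 above, the working part of the VM Options field might look like this (spark.param1 is the hypothetical key from the question, not a built-in Spark property):

```
-Dspark.master=local -Dspark.param1=value1
```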

Can anyone point me at the right way to set this up?

asked Nov 08 '22 by galix85
1 Answer

I don't think it's possible to use --properties-file to launch a Spark application from within IntelliJ IDEA.

spark-submit is a shell script that submits a Spark application for execution; it does a few extra things to create a proper submission environment before the application starts.

You can, however, mimic the behaviour of --properties-file by using conf/spark-defaults.conf, which a Spark application loads by default.

You could create a conf/spark-defaults.conf under src/test/resources (or src/main/resources) with the content of properties.conf. That is supposed to work.
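Another way to mimic --properties-file, sketched here as an assumption rather than the answer's approach: load the file yourself and export each entry as a JVM system property. SparkConf (with its default loadDefaults=true) picks up any system property whose key starts with spark., so this has the same effect as the -D flags in the Run Configuration. The file name and keys below are the hypothetical ones from the question; the sample string stands in for reading properties.conf from disk.

```scala
import java.io.StringReader
import java.util.Properties

// Hypothetical contents of properties.conf; in a real run you would use
// props.load(new java.io.FileReader("properties.conf")) instead.
val sample = "spark.master=local\nspark.param1=value1"

val props = new Properties()
props.load(new StringReader(sample))

// Export every entry as a system property so that
// `new SparkConf()` later picks up the spark.* keys automatically.
props.stringPropertyNames().forEach { k =>
  sys.props(k) = props.getProperty(k)
}

assert(sys.props("spark.param1") == "value1")
```

This runs before the SparkContext is created, so it works in any IntelliJ Run Configuration without touching VM Options.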

answered Nov 14 '22 by Jacek Laskowski