I want to run a standalone Spark script that I've already compiled with the sbt package
command. How can I set up the right Scala run configuration in IntelliJ IDEA to run my script? Currently I run it from the command line with the following command, but I want to run it in IntelliJ as well, for example to debug it further:
~/spark-1.2.0/bin/spark-submit --class "CoinPipe" target/scala-2.10/coinpipe_2.10-1.0.jar /training/data/dir 7 12
Below is a snapshot of what I'm trying to do:
I realize that this post is old, but I ran into the same problem and found a solution, so I figured I'd post it here.
Create a Java Application run configuration with the main class:
org.apache.spark.deploy.SparkSubmit
VM options should include the classpath of the Spark conf and jars directories, at a minimum (Windows-style paths shown here):
-cp "c:\spark\conf\;c:\spark\jars\*"
Program arguments should contain your jar file as the first argument, followed by the actual program arguments that you wish to pass to your program:
yourapp.jar arg1 arg2
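For the CoinPipe example from the question, the program arguments would simply mirror what was passed to spark-submit; keeping the --class option in front of the jar works too, since SparkSubmit takes the same arguments as the spark-submit script:
--class CoinPipe target/scala-2.10/coinpipe_2.10-1.0.jar /training/data/dir 7 12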
"Use classpath of module" should be set to your module.
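With that configuration saved you can start it with Debug instead of Run; as long as the job runs in local or client mode the driver executes inside the IDE's JVM, so breakpoints set in CoinPipe are hit like in any other Java application.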