I have set up Apache Spark 1.1.1 to run on YARN (Hadoop 2.5.2), and I am able to run programs using the spark-submit command.
I am using IntelliJ IDEA 14. I can build artifacts and run the resulting jar with spark-submit.
However, I was wondering if it is possible to run the entire program directly from IntelliJ?
I added the necessary libraries and activated the hadoop-2.4 profile. However, I end up getting the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.UserGroupInformation.getCredentials()Lorg/apache/hadoop/security/Credentials;
at org.apache.spark.deploy.yarn.ClientBase$class.$init$(ClientBase.scala:58)
at org.apache.spark.deploy.yarn.Client.<init>(Client.scala:37)
at org.apache.spark.deploy.yarn.Client.<init>(Client.scala:43)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:91)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:333)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
at WordCountWorkFlow.main(WordCountWorkFlow.java:24)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Can somebody tell me where I am going wrong?
In IntelliJ you have to add a dependency that points to your Hadoop conf dir:
go to Project Structure, and under your module's dependencies add the path $HADOOP_HOME/etc/hadoop.
If you are using lambdas, also go to Project Structure -> Project and set the language level to "8 - Lambdas, type annotations".
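A NoSuchMethodError like the one above usually also points to mismatched Hadoop jars on the classpath (for example, the hadoop-2.4 profile's jars running against a 2.5.2 cluster). As a sketch, assuming the project is built with Maven, declaring Spark and a Hadoop client that matches the cluster version might look like this (the coordinates below are illustrative, not taken from the original question):

```xml
<!-- Spark core and YARN support, built for Scala 2.10 -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.1.1</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-yarn_2.10</artifactId>
  <version>1.1.1</version>
</dependency>
<!-- Match the Hadoop client version to the cluster (2.5.2 here) -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.5.2</version>
</dependency>
```

Keeping the hadoop-client version identical to the cluster's Hadoop version avoids pulling in a UserGroupInformation class that lacks the getCredentials() method Spark's YARN client expects.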