
Getting SparkUncaughtExceptionHandler when running spark-perf

We have set up a distributed Spark cluster (version 1.5.0) and are trying to run spark-perf, but we get the following error and have no idea how to fix it.

15/10/05 20:14:37 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[appclient-registration-retry-thread,5,main]
java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask@43ff6bf rejected from java.util.concurrent.ThreadPoolExecutor@36077c7[Running, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
        at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
        at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
        at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
        at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:96)
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:95)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint.tryRegisterAllMasters(AppClient.scala:95)
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint.org$apache$spark$deploy$client$AppClient$ClientEndpoint$$registerWithMaster(AppClient.scala:121)
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2$$anonfun$run$1.apply$mcV$sp(AppClient.scala:132)
        at org.apache.spark.util.Utils$.tryOrExit(Utils.scala:1119)
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2.run(AppClient.scala:124)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
15/10/05 20:14:37 INFO DiskBlockManager: Shutdown hook called
15/10/05 20:14:37 INFO ShutdownHookManager: Shutdown hook called
15/10/05 20:14:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-c5a4a63b-3dc5-4c52-bd2b-e6df22a0c19f
asked Oct 05 '15 by tobe

2 Answers

Please check the variable SPARK_CLUSTER_URL in config/config.py.

SPARK_CLUSTER_URL = "spark://Master_Ip:7077"

PS: Master_Ip is the master's IP address, not its hostname.
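
As a quick sanity check, you can try to create a SparkContext against the same master URL outside of spark-perf. This is a minimal sketch, assuming PySpark is available; the IP address and script name are placeholders. If registration with the master fails, it dies with the same SparkUncaughtExceptionHandler error shown above.

# connectivity_check.py: hypothetical helper, not part of spark-perf
from pyspark import SparkContext, SparkConf

conf = (SparkConf()
        .setMaster("spark://192.168.1.10:7077")  # placeholder: your master's IP
        .setAppName("connectivity-check"))
sc = SparkContext(conf=conf)
print(sc.parallelize(range(100)).sum())  # prints 4950 if the cluster is reachable
sc.stop()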

answered Sep 30 '22 by Haifeng Li

You have not entered your Spark master URL correctly; it could be an upper-case error. Check that the value of hibench.spark.master in the file conf/99-user_defined_properties.conf is correct. You should be able to connect to the Spark shell by running the following command.

MASTER=<YOUR-SPARK-MASTER-URL-HERE> bin/spark-shell

In Spark's standalone mode, this URL should look something like:

spark://<master-machine-IP>:7077

Generally it is better to use the IP address of the master node rather than the alphabetic hostname that the Spark master reports, e.g. spark://Macs-MacBook-Pro.local:7077.
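
If you only know the hostname, a small sketch using Python's standard library (the hostname below is a placeholder) can resolve it to the numeric address to put in the URL:

# resolve_master.py: hypothetical helper to build a spark:// URL from a hostname
import socket

master_host = "Macs-MacBook-Pro.local"  # placeholder: your master's hostname
master_ip = socket.gethostbyname(master_host)
print("spark://{0}:7077".format(master_ip))  # 7077 is the default standalone port

Alternatively, the Spark master's web UI (port 8080 by default) displays the exact spark:// URL it is listening on.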

answered Sep 30 '22 by jaywalker