
Why does Spark exit with exitCode: 16?

Tags:

apache-spark

I am using Spark 2.0.0 with Hadoop 2.7 in yarn-cluster mode. Every time I run my application, I get the following error:

17/01/04 11:18:04 INFO spark.SparkContext: Successfully stopped SparkContext
17/01/04 11:18:04 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)
17/01/04 11:18:04 INFO util.ShutdownHookManager: Shutdown hook called
17/01/04 11:18:04 INFO util.ShutdownHookManager: Deleting directory /tmp/hadoop-hduser/nm-local-dir/usercache/harry/appcache/application_1475261544699_0833/spark-42e40ac3-279f-4c3f-ab27-9999d20069b8
17/01/04 11:18:04 INFO spark.SparkContext: SparkContext already stopped.

However, I do get the correct printed output. The same code works fine on Spark 1.4.0 with Hadoop 2.4.0, where I do not see any exit codes.

asked Jan 04 '17 by Harry

1 Answer

This is the issue ".sparkStaging not cleaned if application exited incorrectly" (https://issues.apache.org/jira/browse/SPARK-17340), which appeared after Spark 1.4 (affected versions: 1.5.2, 1.6.1, 2.0.0).

The issue is that when running Spark in yarn-cluster mode and the application is killed, the .sparkStaging directory is not cleaned up.

When this happens in Spark 2.0.x, exit code 16 is raised:

ERROR ApplicationMaster: RECEIVED SIGNAL TERM
INFO ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)

Is it possible that something in your code is killing the application? If so, the error would not be seen in Spark 1.4 but would appear in Spark 2.0.0.

Please search your code for "exit": if you have such a call, the error won't be shown in Spark 1.4, but it will appear in Spark 2.0.0.
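As an illustration, here is a minimal Scala sketch (with a hypothetical driver object and a placeholder job, not your actual code) of the pattern to look for and the usual fix. The commented-out System.exit(0) is the kind of call that fires the shutdown hook before the ApplicationMaster can report the final status:

import org.apache.spark.sql.SparkSession

object MyJob {  // hypothetical driver name
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("my-job").getOrCreate()

    // Placeholder for the real work.
    val result = spark.range(100).count()
    println(s"count = $result")

    // Problematic pattern: an explicit exit kills the JVM before the
    // ApplicationMaster reports SUCCEEDED, so the shutdown hook runs first
    // and YARN records exitCode 16.
    // System.exit(0)

    // Preferred: stop the SparkSession and let main() return normally.
    spark.stop()
  }
}

If you really need to terminate early, stop the SparkSession first and avoid System.exit on the success path, so the ApplicationMaster can report the final status cleanly.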

answered Nov 15 '22 by Yaron