 

How do I stop a spark streaming job?

I have a Spark Streaming job which has been running continuously. How do I stop the job gracefully? I have read the usual recommendation of attaching a shutdown hook in the job monitoring code and sending a SIGTERM to the job.

sys.ShutdownHookThread {
  logger.info("Gracefully stopping Application...")
  ssc.stop(stopSparkContext = true, stopGracefully = true)
  logger.info("Application stopped gracefully")
}

It seems to work, but it does not look like the cleanest way to stop the job. Am I missing something here?

From a code perspective it may make sense, but how do you use this in a cluster environment? If we start a Spark Streaming job (we distribute the jobs across all the nodes in the cluster), we have to keep track of the PID for the job and the node on which it is running. Then, when we have to stop the process, we need to know which node the job was running on and its PID. I was just hoping there would be a simpler way of job control for streaming jobs.

asked Sep 15 '15 by Saket

People also ask

How do you stop a streaming job?

If all you need is to stop a running streaming application, the simplest way is via the Spark admin UI (you can find its URL in the startup logs of the Spark master). There is a section in the UI that shows running streaming applications, with a small (kill) link next to each application ID.

How do I stop a spark streaming job in Databricks?

To stop the streaming job, just remove the marker file with %fs rm -r your_path when using DBFS, or rm -r your_path for the local FS.
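The marker-file approach above can be sketched as a small polling loop. This is a hypothetical helper, not Databricks API: `awaitShutdownMarker`, the path, and the poll interval are all illustrative, and the stop action is injected so that in a real job it would be `ssc.stop(stopSparkContext = true, stopGracefully = true)`.

```scala
import java.nio.file.{Files, Paths}

// Hypothetical helper: block until the marker file disappears, then
// invoke the supplied stop action. In a real Spark Streaming job the
// stop action would call ssc.stop(stopSparkContext = true,
// stopGracefully = true).
object ShutdownMarker {
  def awaitShutdownMarker(markerPath: String, pollMs: Long)(stop: () => Unit): Unit = {
    val path = Paths.get(markerPath)
    // Poll until someone deletes the marker file (e.g. via rm -r).
    while (Files.exists(path)) {
      Thread.sleep(pollMs)
    }
    stop()
  }
}
```

In driver code you would run this in a background thread after `ssc.start()`; deleting the marker file from outside the cluster then triggers the graceful stop without needing the PID of any process.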

How do you turn off spark?

In client mode, your application (the Spark driver) runs on the server where you issue the spark-submit command. In this mode, to stop your application just type Ctrl-C. This exits the application and returns you to the command prompt.


1 Answer

You can stop your streaming context in cluster mode by running the following command, without needing to send a SIGTERM. This stops the streaming context without you having to stop it explicitly via a shutdown hook.

$SPARK_HOME_DIR/bin/spark-submit --master $MASTER_REST_URL --kill $DRIVER_ID

- $MASTER_REST_URL is the REST URL of the Spark master, i.e. something like spark://localhost:6066

- $DRIVER_ID is something like driver-20150915145601-0000
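Under the hood, spark-submit --kill issues a POST to the standalone master's REST submission endpoint (/v1/submissions/kill/&lt;driverId&gt;), so you can also do this programmatically. A minimal sketch, assuming the REST server is enabled and reachable; DriverKiller and restUrl are illustrative names:

```scala
import java.net.{HttpURLConnection, URL}

// Sketch: kill a driver via the standalone master's REST submission API.
// Assumes the REST server is enabled and reachable at restUrl,
// e.g. "http://localhost:6066".
object DriverKiller {
  def kill(restUrl: String, driverId: String): Int = {
    val url = new URL(s"$restUrl/v1/submissions/kill/$driverId")
    val conn = url.openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setDoOutput(true)
    conn.getOutputStream.close() // empty request body
    val code = conn.getResponseCode
    conn.disconnect()
    code
  }
}
```

This removes the need to track PIDs or nodes: any machine that can reach the master's REST port can stop the job by driver ID.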

If you want Spark to stop your app gracefully, you can try setting the following system property when your Spark app is initially submitted (see http://spark.apache.org/docs/latest/submitting-applications.html on setting Spark configuration properties).

spark.streaming.stopGracefullyOnShutdown=true
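As a config fragment, the flag can be passed at submit time with --conf; the master URL, class name, and jar path below are placeholders:

```
spark-submit \
  --master spark://localhost:7077 \
  --conf spark.streaming.stopGracefullyOnShutdown=true \
  --class com.example.MyStreamingApp \
  my-streaming-app.jar
```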

This is not officially documented; I gathered it from looking at the 1.4 source code. This flag is honored in standalone mode. I haven't tested it in clustered mode yet.

I am working with Spark 1.4.*

answered Oct 11 '22 by ud3sh