 

Killing a Spark job using the command prompt

Tags:

apache-spark

What is the command to kill a Spark job from the terminal? I don't want to kill a running Spark job via the Spark UI.

asked Apr 05 '17 by Surender Raja


1 Answer

If you are running on YARN, use:

yarn application -kill <application ID>

Get the application ID from the YARN web UI, or list running applications with yarn application -list.
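
For example, assuming a hypothetical application ID of application_1491382460213_0005 (yours will differ), the full flow looks like:

yarn application -list
yarn application -kill application_1491382460213_0005

The first command prints each running application's ID, name, and state; pass the ID of the job you want to stop to the second.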

If you are running on a Spark standalone cluster and submitted the job with --deploy-mode cluster, you can kill the driver with:

./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
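
For example, with a hypothetical master URL of spark://master-host:7077 and a driver ID copied from the standalone master's web UI (both values are placeholders, yours will differ):

./bin/spark-class org.apache.spark.deploy.Client kill spark://master-host:7077 driver-20170405110159-0001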

Alternatively, you can find the PID of the spark-submit process with the jps command and kill that process, but this is not the recommended way.
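
A sketch of that last approach, assuming the PID reported by jps is 12345 (a placeholder); the spark-submit JVM typically appears under the name SparkSubmit in jps output:

jps | grep SparkSubmit
kill 12345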

answered Sep 17 '22 by koiralo