What is the command to kill a Spark job from the terminal? I don't want to kill a running Spark job via the Spark UI.
If you are running on YARN, use:

yarn application -kill <applicationID>

Get the application ID from the web UI, or list running applications with yarn application -list.
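If you'd rather not copy the ID by hand, you can grep it out of the `yarn application -list` output. A minimal sketch, with the job name "my-spark-job" and the sample listing line as assumptions (in practice the listing comes straight from `yarn application -list`):

```shell
# Normally: list_output=$(yarn application -list)
# Here a sample line stands in so the sketch is self-contained.
list_output='application_1618830773059_0001  my-spark-job  SPARK  user  default  RUNNING'

# The application ID is the first column of the matching row.
app_id=$(printf '%s\n' "$list_output" | awk '/my-spark-job/ {print $1}')
echo "$app_id"

# Then kill it (commented out since yarn is not available here):
# yarn application -kill "$app_id"
```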
If you are running on a standalone cluster (driver submitted in cluster deploy mode), kill the driver with:

./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
Alternatively, you can find the spark-submit process ID with the jps command and kill that process, but this is not the recommended way.
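For completeness, the process-kill approach looks like this. A background `sleep` stands in for the SparkSubmit process so the sketch is runnable; in practice you would take the PID from the `jps` output line labeled SparkSubmit:

```shell
# Stand-in for a running spark-submit process (assumption for the demo).
sleep 60 &
pid=$!

# Equivalent of: kill <SparkSubmit PID shown by jps>
kill "$pid"
wait "$pid" 2>/dev/null

echo "killed $pid"
```

Note that killing the client process this way only works cleanly in client deploy mode; in cluster mode the driver runs elsewhere, which is one reason this approach is discouraged.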