How can I kill a running process in the Spark shell on my local OSX machine without exiting?
For example, if I just do a simple `.count()` on an RDD, it can take a while, and sometimes I want to kill it. However, if I press Ctrl-C, it kills the whole shell. Is there a way to kill the process but not the shell?
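For instance, a minimal illustration of the kind of call I mean (the RDD and its size are made up; `sc` is the SparkContext the shell provides):

```scala
// Illustrative only: a long-running action typed at the spark-shell prompt.
val rdd = sc.parallelize(1 to 100000000) // hypothetical large dataset
rdd.count() // can take a while; pressing Ctrl-C here kills the whole shell
```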
You can use the Spark web UI (the Master web interface on a standalone cluster) to kill or visualize the job. You will also find other useful things there, such as log files and charts of your cluster's activity.
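As a minimal sketch of finding that UI from inside the shell, assuming Spark 2.0+ where the shell binds the context to `sc` (in local mode the UI is typically at http://localhost:4040). The `sc.cancelAllJobs()` call at the end is a programmatic alternative to the web UI route, not part of it:

```scala
// Print this application's web UI address (uiWebUrl exists in Spark 2.0+).
sc.uiWebUrl.foreach(println) // e.g. http://localhost:4040

// In the UI's "Jobs" tab, each active job has a "(kill)" link, as long as
// spark.ui.killEnabled is true (the default).

// Programmatic alternative: cancel every job running on this SparkContext
// without leaving the shell.
sc.cancelAllJobs()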