Spark: How to kill running process without exiting shell?

Tags:

apache-spark

How can I kill a running process in the Spark shell on my local OSX machine without exiting?

For example, if I just do a simple .count() on an RDD, it can take a while and sometimes I want to kill it.

However, if I press Ctrl-C, it kills the whole shell.

Is there a way to kill the process but not the shell?

asked Oct 15 '15 by Richard


1 Answer

You can use the Spark web UI to kill or monitor the job. In local mode, the application UI (by default at http://localhost:4040) lists active jobs and stages with a kill link next to each, as long as spark.ui.killEnabled is true (the default); on a standalone cluster, the Master web UI offers the same. You will also find other useful things there, such as executor logs and a visualization of the work running on your cluster.
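If you would rather stay entirely inside the shell, SparkContext also exposes this programmatically. Below is a minimal sketch for the Spark shell (where sc is the SparkContext the shell provides): the group name "slow-count" and the example RDD are arbitrary placeholders. The idea is to run the long job in a background thread under a named job group, then cancel that group from the prompt.

```scala
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

// Kick the slow job off asynchronously so the shell prompt stays usable.
val pending = Future {
  // Tag everything submitted from this thread with a job group,
  // so it can be cancelled by name later.
  sc.setJobGroup("slow-count", "a deliberately slow count", interruptOnCancel = true)
  sc.parallelize(1 to 100000000).count()
}

// ...later, from the same shell, kill just that job group:
sc.cancelJobGroup("slow-count")

// or kill every active job owned by this SparkContext:
// sc.cancelAllJobs()
```

Either route, the web UI's kill link or the cancel calls above, stops the job while leaving the shell and the SparkContext alive.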

answered Oct 21 '22 by Phan Nghiêm Hải