
Is there a way to kill reducer task in Hadoop?

I am running a few map reduce jobs and one job takes over all the reducer capacity. Is there a way to kill one or two reducer tasks to free up the cluster?

I can go directly to one of the task tracker servers and kill the Java process manually, but I am wondering if there is a cleaner way to do this.

asked Oct 17 '13 19:10 by interskh


People also ask

How do you kill a running job in Hadoop?

Run hadoop job -kill <job_id> (or, on YARN, yarn application -kill <application_id>), replacing the ID with the job you want to eliminate; the job will be removed from the running/submitted/accepted categories.

How do you stop a yarn job?

Another option to stop a running application is to use the YARN command line (this approach does not require port forwarding). SSH into the master node (via a bastion host, if needed) and run:

sudo yarn application -kill <application-ID>

How do you kill a hive job?

Run yarn application -kill <application-id> to kill an app; typing yarn alone in a terminal and hitting enter shows the list of available commands, so it is worth getting familiar with the yarn CLI. You can kill any YARN application this way, including Hive queries, as long as Hive is running on YARN.
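The yarn CLI steps above can be scripted; a minimal sketch, assuming the sample line below stands in for real yarn application -list output (the application ID and name are hypothetical):

```shell
# Sketch: pick the application ID out of one line of
# `yarn application -list`-style output and build the kill command.
# The sample line is hypothetical; on a real cluster you would pipe
# the output of `yarn application -list` in directly.
sample_line='application_1381950000000_0007  myHiveQuery  MAPREDUCE  RUNNING'
app_id=$(echo "$sample_line" | awk '{print $1}')
cmd="yarn application -kill $app_id"
echo "$cmd"   # the command you would then run on the cluster
```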


2 Answers

You can kill a task attempt with:

hadoop job -kill-task [task_attempt_id]

To get the task attempt ID, go one level deeper into the task (click the task hyperlink in the JobTracker web UI).
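The task attempt ID has a predictable shape; a minimal sketch, assuming MRv1-style IDs (the concrete timestamp, job number, and task number below are hypothetical):

```shell
# Sketch: MRv1 task-attempt IDs follow the pattern
#   attempt_<cluster-start-timestamp>_<job-number>_<m|r>_<task-number>_<attempt-number>
# where `r` marks a reduce task. All concrete values here are hypothetical.
attempt_id="attempt_201310171700_0042_r_000003_0"
cmd="hadoop job -kill-task $attempt_id"
echo "$cmd"   # the command you would then run on the cluster
```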

answered Sep 21 '22 01:09 by solution


First find the job ID:

hadoop job -list

Now, kill the job:

hadoop job -kill <job_ID_goes_here>
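The two steps above can be combined; a minimal sketch that parses hadoop job -list style output for job IDs (the sample lines and job ID are hypothetical; on a real cluster you would pipe the output of hadoop job -list in directly):

```shell
# Sketch: pull job IDs out of `hadoop job -list`-style output and
# build the corresponding kill command. Sample lines are hypothetical.
sample_list='JobId                  State  StartTime      UserName
job_201310171700_0042  1      1382036000000  alice'
cmd=$(echo "$sample_list" | awk '/^job_/ {print "hadoop job -kill " $1}')
echo "$cmd"   # the command you would then run on the cluster
```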
answered Sep 22 '22 01:09 by cabad