I installed and configured my Hadoop single-node cluster using
http://wiki.apache.org/hadoop/Running_Hadoop_On_Ubuntu_Linux_%28Single-Node_Cluster%29
Now the NameNode web UI (http://localhost:50070) is running fine, but the
JobTracker web UI (http://localhost:50030) is not working.
What could be the cause?
Thanks
The JobTracker is the service within Hadoop that farms out MapReduce tasks to specific nodes in the cluster, ideally the nodes that have the data, or at least nodes in the same rack. Client applications submit jobs to the JobTracker, and the JobTracker talks to the NameNode to determine the location of the data.
Each TaskTracker performs its tasks while being closely monitored by the JobTracker. If a task fails, the JobTracker simply resubmits it to another TaskTracker, and it updates the job's status when the job completes. However, the JobTracker itself is a single point of failure: if it fails, the whole system goes down.
The JobTracker is a daemon that runs on the master node (in a single-node setup, the same machine as the NameNode) for submitting and tracking MapReduce jobs in Hadoop. It assigns tasks to the different TaskTrackers. In a Hadoop cluster there is only one JobTracker but many TaskTrackers. If the JobTracker goes down, all running jobs are halted.
After you run $HADOOP_HOME/bin/start-all.sh, you can type the command "jps" to check whether all the necessary Hadoop processes have started. If everything is OK, the output should look like this:
hd0@HappyUbuntu:/usr/local/hadoop$ jps
18694 NameNode
19576 TaskTracker
19309 JobTracker
19225 SecondaryNameNode
19629 Jps
18972 DataNode
It's possible that your JobTracker process is not running, so check that first. If it is missing, look into the log files in the logs directory for a more specific reason.
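As a quick check, you can compare the `jps` output against the daemon list shown above. Here is a minimal sketch of such a check as a portable shell function; the function name `check_daemons` is my own, and the daemon names assume the Hadoop 1.x single-node layout from the wiki guide:

```shell
#!/bin/sh
# Sketch: read a `jps` listing on stdin and report which of the
# expected single-node Hadoop daemons are absent.
# Usage:  jps | check_daemons
check_daemons() {
    listing="$(cat)"
    missing=""
    for daemon in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
        # -w matches whole words, so "SecondaryNameNode" does not
        # count as a match for "NameNode".
        printf '%s\n' "$listing" | grep -qw "$daemon" \
            || missing="$missing $daemon"
    done
    if [ -z "$missing" ]; then
        echo "all expected daemons are running"
    else
        echo "missing:$missing"
        # A stopped JobTracker can be restarted on its own in Hadoop 1.x:
        #   $HADOOP_HOME/bin/hadoop-daemon.sh start jobtracker
        # and its log ends up under $HADOOP_HOME/logs/.
    fi
}
```

If the JobTracker shows up as missing, the comments above point at the per-daemon restart script and the log directory, which is where the actual failure reason will be recorded.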