What are simple commands to check if Hadoop daemons are running?
For example, if I'm trying to figure out why HDFS is not set up correctly, I'll want a way to check whether the namenode/datanode/jobtracker/tasktracker are running on this machine.
Is there any way to check this quickly without looking into logs or using ps (on Linux)?
To check whether the Hadoop daemons are running, just run the jps command in the shell (make sure a JDK is installed on your system, since jps ships with it). It lists all the running Java processes, including any Hadoop daemons that are up.
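For example, on a single-node (pseudo-distributed) Hadoop 1.x setup where everything is up, the output looks roughly like this (the process IDs below are purely illustrative):

    $ jps
    4855 NameNode
    5012 DataNode
    5170 SecondaryNameNode
    5254 JobTracker
    5431 TaskTracker
    5683 Jps

If one of the daemons is missing from this list, that is usually the first hint of where the setup went wrong.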
The HDFS daemons, i.e. NameNode, DataNode and Secondary NameNode, help store the huge volume of data, and the MapReduce daemons, i.e. JobTracker and TaskTracker, help process it. Together these daemons make Hadoop capable of storing and retrieving data at any time.
Startup scripts:
start-dfs.sh - Starts the Hadoop DFS daemons, the namenode and datanodes. Use this before start-mapred.sh.
stop-dfs.sh - Stops the Hadoop DFS daemons.
start-mapred.sh - Starts the Hadoop Map/Reduce daemons, the jobtracker and tasktrackers.
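If you'd rather not eyeball the jps output after running the startup scripts, here is a minimal shell sketch (assuming jps is on your PATH and you are running the classic Hadoop 1.x daemons) that checks for each expected daemon by name:

    for daemon in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
        # -w matches whole words only, so "NameNode" does not also match "SecondaryNameNode"
        if jps | grep -qw "$daemon"; then
            echo "$daemon is running"
        else
            echo "$daemon is NOT running"
        fi
    done

Adjust the daemon names in the list to whatever your cluster is supposed to run.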