
" hadoop fs -ls " listing files in the present working directory

Tags: hadoop, hdfs

I am following Udacity's course on Hadoop, which instructs using the command hadoop fs -ls to list files. But on my machine running Ubuntu, it instead lists files in the present working directory. What am I doing wrong?

The command which hadoop gives the output: /home/usrname/hadoop-2.5.1//hadoop

Are the double slashes in the path the cause of this problem?

asked Dec 10 '25 by dev
2 Answers

Your file system must be pointing to the local file system: when the default file system is left unset, it defaults to file:///, so hadoop fs -ls lists the local working directory. Modify the configuration to point it to HDFS and restart the Hadoop processes.

Check this configuration:

    <property>
        <name>fs.default.name</name>
        <value>hdfs://<IP>:<Port></value>
    </property>
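For context, a fuller core-site.xml sketch is below. The host name and port are placeholders, not values from the question; also note that in Hadoop 2.x (the asker runs 2.5.1), fs.default.name is deprecated in favor of fs.defaultFS, though both still work:

    <configuration>
        <property>
            <!-- Preferred key name since Hadoop 2.x; placeholder host/port -->
            <name>fs.defaultFS</name>
            <value>hdfs://namenode-host:9000</value>
        </property>
    </configuration>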
answered Dec 13 '25 by Ashish


You have to set up the path to the Hadoop root folder in your current user's .bashrc file, something like:

export HADOOP_HOME=/home/seo/hadoop/hadoop-1.2.1

then add it to your system PATH variable:

export PATH=$PATH:$HADOOP_HOME/bin

Then running

hadoop fs -ls

will list your HDFS files, provided your Hadoop cluster is up and running.
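The two exports above can be sketched as a self-checking snippet. The install path is the one from this answer (adjust to your own); the case statement at the end is only a sanity check that the bin directory actually landed on PATH. Note the original answer had a trailing colon on the PATH line, which would add an empty entry (treated as the current directory) to PATH.

```shell
# Point HADOOP_HOME at the Hadoop install root (example path; adjust to yours)
export HADOOP_HOME=/home/seo/hadoop/hadoop-1.2.1

# Append the Hadoop binaries to PATH (no trailing colon, which would
# add an empty entry, i.e. the current directory, to PATH)
export PATH="$PATH:$HADOOP_HOME/bin"

# Sanity check: is $HADOOP_HOME/bin now a PATH entry?
case ":$PATH:" in
  *":$HADOOP_HOME/bin:"*) echo "hadoop bin on PATH" ;;
  *)                      echo "hadoop bin missing from PATH" ;;
esac
```

After sourcing .bashrc (or opening a new shell), `which hadoop` should resolve inside $HADOOP_HOME/bin.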

answered Dec 13 '25 by Rajen Raiyarela


