
hadoop fs -ls results in "no such file or directory"

Tags: uri, hadoop, hdfs

I have installed and configured Hadoop 2.5.2 on a 10-node cluster. One node acts as the master node and the other nodes as slave nodes.

I have a problem executing hadoop fs commands. The hadoop fs -ls command works fine with an HDFS URI, but it gives the message "ls: `.': No such file or directory" when used without one:

ubuntu@101-master:~$ hadoop fs -ls
15/01/30 17:03:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
ubuntu@101-master:~$

Whereas executing the same command with an HDFS URI works:

ubuntu@101-master:~$ hadoop fs -ls hdfs://101-master:50000/
15/01/30 17:14:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 3 items
drwxr-xr-x   - ubuntu supergroup          0 2015-01-28 12:07 hdfs://101-master:50000/hvision-data
-rw-r--r--   2 ubuntu supergroup   15512587 2015-01-28 11:50 hdfs://101-master:50000/testimage.seq
drwxr-xr-x   - ubuntu supergroup          0 2015-01-30 17:03 hdfs://101-master:50000/wrodcount-in
ubuntu@101-master:~$

I am getting an exception in my MapReduce program because of this behavior: jarlib refers to the HDFS file location, whereas I want jarlib to refer to the jar files stored on the local file system of the Hadoop nodes.

Asked Jan 30 '15 by Tariq

1 Answer

The behaviour you are seeing is expected. Let me explain what's going on when you work with hadoop fs commands.

The command's syntax is this: hadoop fs -ls [path]

By default, when you don't specify [path] for the above command, Hadoop expands the path to /user/[username] in HDFS, where [username] is replaced with the Linux username of the user executing the command.

So, when you execute this command:

ubuntu@101-master:~$ hadoop fs -ls

you see the error ls: `.': No such file or directory because Hadoop is looking for the path /user/ubuntu, and it seems that this path doesn't exist in HDFS.
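If you want the bare hadoop fs -ls to work, one way is to create that home directory in HDFS first. A minimal sketch, assuming the default /user prefix and that the ubuntu user is allowed to create directories there:

ubuntu@101-master:~$ hadoop fs -mkdir -p /user/ubuntu   # create the HDFS home directory for the ubuntu user
ubuntu@101-master:~$ hadoop fs -ls                      # '.' now resolves to /user/ubuntu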

The reason this command:

ubuntu@101-master:~$ hadoop fs -ls hdfs://101-master:50000/ 

works is that you have explicitly specified [path], and it is the root of HDFS. You can also do the same using this:

ubuntu@101-master:~$ hadoop fs -ls / 

which is automatically resolved against the default file system and evaluates to the root of HDFS.
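If you are unsure which file system a bare path resolves against, you can print the configured default. A quick check, assuming a standard Hadoop 2.x client configuration on the node:

ubuntu@101-master:~$ hdfs getconf -confKey fs.defaultFS   # e.g. hdfs://101-master:50000
ubuntu@101-master:~$ hadoop fs -ls /                      # same listing as hdfs://101-master:50000/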

Hope this clears up the behaviour you are seeing while executing the hadoop fs -ls command.

Hence, if you want to specify a local file system path, use the file:/// URI scheme.
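For example, to point hadoop fs commands at the local file system of a node instead of HDFS, prefix the path with file://. A sketch, where /usr/local/hadoop/lib is only a hypothetical local directory holding your jar files:

ubuntu@101-master:~$ hadoop fs -ls file:///usr/local/hadoop/lib   # lists the node's local directory, not HDFS

The same file:/// form can be used in configuration values that take a Hadoop path, which is how jarlib can be made to point at jars on each node's local disk rather than at a location in HDFS.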

Answered Nov 11 '22 by Ashrith