I have installed Hadoop 2.6.0 and I'm playing around with it. I'm trying the pseudo-distributed setup, following the instructions at http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html#Execution. I'm stuck at the 5th step, i.e. when I run the command
bin/hdfs dfs -put etc/hadoop input
I get the error below.
15/02/02 00:35:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `input': No such file or directory
Why am I getting this error? How can I resolve it?
To copy a file from the local file system to HDFS, use hadoop fs -put or hdfs dfs -put. On the put command, specify the local file path you want to copy from, followed by the HDFS path you want to copy to. If the file already exists on HDFS, you will get an error message saying "File already exists".
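For example, a minimal sketch with explicit source and destination paths (the file and directory names below are placeholders, not from the original question):
bin/hdfs dfs -put /home/user/sample.txt /user/hadoop/input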
copyFromLocal (or) put: copies files/folders from the local file system to HDFS. This is the most important command.
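Both forms behave the same way for this purpose; a quick sketch with placeholder paths:
bin/hdfs dfs -copyFromLocal /tmp/data.txt /user/hadoop/
bin/hdfs dfs -put /tmp/data.txt /user/hadoop/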
In addition to what Ashrith wrote, -p can also be added, in case the directory has not been created yet:
bin/hadoop fs -mkdir -p /path/to/hdfs/dir
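For example, assuming the home-directory layout the tutorial expects (substitute your own username for <username>), you could prepare the target directory and copy into it like this:
bin/hadoop fs -mkdir -p /user/<username>/input
bin/hadoop fs -put etc/hadoop/*.xml /user/<username>/input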
Hope this helps someone else.
You are getting the error because there is no such directory at the specified path. Please take a look at my answer to a similar question, which explains how Hadoop interprets relative paths: a relative path like input is resolved against your HDFS home directory, /user/<username>, which must already exist.
Make sure you create the directory first using:
bin/hadoop fs -mkdir input
and then re-execute the -put command.
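Putting the pieces together, a full sequence that should resolve the original error, run from the Hadoop installation directory, would look roughly like this:
bin/hdfs dfs -mkdir -p /user/<username>
bin/hdfs dfs -mkdir input
bin/hdfs dfs -put etc/hadoop input
bin/hdfs dfs -ls input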