
hadoop fs -put command

I have set up a single-node Hadoop environment on CentOS using the Cloudera CDH repository. To copy a local file to HDFS, I used the command:

sudo -u hdfs hadoop fs -put /root/MyHadoop/file1.txt /

But the result disappointed me:

put: '/root/MyHadoop/file1.txt': No such file or directory

I'm sure this file does exist.

Please help me, thanks!

asked Aug 28 '13 by skfeng

People also ask

What is Hadoop FS command?

The Hadoop fs shell command put is similar to copyFromLocal: both copy files or directories from the local filesystem to a destination in the Hadoop filesystem.
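As an illustration (source and destination paths here are hypothetical), the two forms are interchangeable for a local source:

```shell
# Both copy a local file into HDFS; paths are illustrative
hadoop fs -put /tmp/data.txt /user/hdfs/
hadoop fs -copyFromLocal /tmp/data.txt /user/hdfs/
```

One difference worth knowing: `-put` can also read from stdin, while `-copyFromLocal` is restricted to local file sources.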

How do you access Hadoop FS?

Access HDFS through its web UI: open your browser and go to localhost:50070 (the NameNode web UI on Hadoop 2.x). Open the Utilities tab on the right and click Browse the file system to see the list of files in your HDFS; from there you can also download a file to your local file system.
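If you prefer the command line to the browser, the same information is exposed through the WebHDFS REST API on the same port, assuming WebHDFS is enabled (the file name below is illustrative):

```shell
# List the HDFS root directory over WebHDFS (Hadoop 2.x NameNode, port 50070)
curl -s "http://localhost:50070/webhdfs/v1/?op=LISTSTATUS"
# Download a file to the local filesystem; -L follows the redirect to a datanode
curl -s -L "http://localhost:50070/webhdfs/v1/file1.txt?op=OPEN" -o file1.txt
```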

What is in HDFS dfs command?

In Hadoop, the hdfs dfs -du or hadoop fs -du commands are used to get the size of a single file, or the sizes of all files in a directory. By default, they point to the current directory when no path is specified. (The -find command locates files matching an expression; it does not report sizes.)
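A quick sketch of the size commands (the paths are illustrative):

```shell
# Size of a single file in bytes (and bytes including replication)
hdfs dfs -du /user/hdfs/file1.txt
# Summarized (-s) total for a whole directory, human-readable (-h)
hdfs dfs -du -s -h /user/hdfs
```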

How do I delete a non empty directory in Hadoop?

Use an HDFS file manager to delete directories, if your Hadoop distribution provides one (see its documentation). Alternatively, log into the Hadoop NameNode using the database administrator's account and use HDFS's rm -r command (the older rmr form is deprecated) to delete the directories.
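For instance, to remove a non-empty directory from the shell (the directory name is illustrative):

```shell
# -r recurses into the directory; -skipTrash bypasses the trash if it is enabled
hdfs dfs -rm -r /user/hdfs/old_dir
# Deprecated but equivalent older form:
hadoop fs -rmr /user/hdfs/old_dir
```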


3 Answers

As user hdfs, do you have access rights to /root/ on your local disk? Usually you don't. You must copy file1.txt to a place where the local hdfs user has read rights before trying to copy it into HDFS.

Try:

cp /root/MyHadoop/file1.txt /tmp
chown hdfs:hdfs /tmp/file1.txt
# older versions of Hadoop
sudo -u hdfs hadoop fs -put /tmp/file1.txt /
# newer versions of Hadoop
sudo -u hdfs hdfs dfs -put /tmp/file1.txt /

--- edit:

Take a look at roman-nikitchenko's cleaner answer below.

answered Oct 10 '22 by Alfonso Nishikawa


I had the same situation and here is my solution:

 HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /

Advantages:

  1. You don't need sudo.
  2. You don't actually need a local user 'hdfs' at all.
  3. You don't need to copy anything or change permissions because of previous points.
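The same environment-variable trick works for any fs command, e.g. to verify the upload afterwards (note that HADOOP_USER_NAME is only honored on clusters without Kerberos authentication):

```shell
# List the HDFS root as the hdfs superuser; file1.txt should appear here
HADOOP_USER_NAME=hdfs hdfs dfs -ls /
```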
answered Oct 10 '22 by Roman Nikitchenko


Try creating a directory in HDFS first, then put the file into it:

$ hadoop fs -mkdir your_dir
$ hadoop fs -put /root/MyHadoop/file1.txt your_dir

answered Oct 10 '22 by elkoo