Hadoop Put command for two files

Tags:

hdfs

A file named records.txt can be copied from the local file system to HDFS with the following command:

hadoop dfs -put /home/cloudera/localfiles/records.txt /user/cloudera/inputfiles

This copies records.txt into HDFS under the same name.

But I want to store two files (records1.txt and demo.txt) in HDFS.

I know that a glob pattern can match multiple files, like below:

hadoop dfs -put /home/cloudera/localfiles/records* /user/cloudera/inputfiles

But is there a command that copies two or more files with different names into HDFS?

asked Oct 23 '25 by Surender Raja

2 Answers

The put command accepts one or more source files as arguments. So try something like:

hadoop dfs -put /home/cloudera/localfiles/records* /home/cloudera/localfiles/demo* /user/cloudera/inputfiles

From the Hadoop shell command usage documentation:

put

Usage: hadoop fs -put <localsrc> ... <dst>

Copy single src, or multiple srcs from local file system to the destination filesystem. Also reads input from stdin and writes to destination filesystem.

hadoop fs -put localfile /user/hadoop/hadoopfile
hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile
Reads the input from stdin.

Exit Code:

Returns 0 on success and -1 on error. 
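Putting the pieces together, a minimal sketch of copying two differently named files in one put, with the exit code checked as described above. The paths are the illustrative ones from the question, and the script falls back to printing the intended command when no hadoop binary is on the PATH:

```shell
# Illustrative local sources and HDFS destination (assumed paths from the question)
SRC1=/home/cloudera/localfiles/records1.txt
SRC2=/home/cloudera/localfiles/demo.txt
DST=/user/cloudera/inputfiles

if command -v hadoop >/dev/null 2>&1; then
  # -put takes multiple sources followed by one destination directory;
  # it returns 0 on success, non-zero on error
  if hadoop fs -put "$SRC1" "$SRC2" "$DST"; then
    echo "copied both files to $DST"
  else
    echo "put failed" >&2
  fi
else
  # No cluster available here; show the command that would run
  echo "hadoop fs -put $SRC1 $SRC2 $DST"
fi
```

Note that both source paths are listed before the single destination; the destination must be a directory when more than one source is given.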
answered Oct 26 '25 by SMA

It can also be done with the copyFromLocal command, which likewise accepts multiple sources:

hduser@ubuntu:/usr/local/pig$ hadoop dfs -copyFromLocal /home/user/Downloads/records1.txt /home/user/Downloads/demo.txt /user/pig/output

answered Oct 26 '25 by user2241632