
Hadoop: copy a local file system folder to HDFS

Tags:

hadoop

hdfs

I need to copy a folder from the local file system to HDFS. I could not find any example of moving a folder (including all its subfolders) to HDFS.

$ hadoop fs -copyFromLocal /home/ubuntu/Source-Folder-To-Copy HDFS-URI

asked Jan 29 '15 by Tariq


People also ask

How do I copy a directory from local to HDFS?

Step 1: Create the directory in HDFS where you want to copy the folder, using hadoop fs -mkdir. Step 2: Use hadoop fs -copyFromLocal to copy the local folder into that HDFS directory. Step 3: Check whether the copy succeeded by listing the destination, as in the sketch below.
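A minimal sketch of those three steps, assuming the source folder from the question and a hypothetical /Hadoop_File target directory:

hadoop fs -mkdir /Hadoop_File                                             # step 1: create the HDFS target directory
hadoop fs -copyFromLocal /home/ubuntu/Source-Folder-To-Copy /Hadoop_File  # step 2: copy the local folder into it
hadoop fs -ls /Hadoop_File                                                # step 3: verify the folder was copied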

How can I copy multiple files from local to HDFS?

From the Hadoop shell command usage for put: Usage: hadoop fs -put <localsrc> ... <dst> — copies a single src, or multiple srcs, from the local file system to the destination filesystem.
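For example, to upload several files in one invocation (the file names and the /user/ubuntu/dest target are hypothetical):

hadoop fs -put file1.txt file2.txt file3.txt /user/ubuntu/dest   # multiple local sources, one HDFS destination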

Which command is used to copy files or directories from local file system to HDFS?

copyFromLocal (or put) copies files or directories from the local file system into HDFS. By contrast, cp copies files from one directory to another within HDFS, similar to the Unix cp command.
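A short sketch of the distinction, with hypothetical paths:

hadoop fs -copyFromLocal /tmp/report.csv /user/ubuntu/data/        # local file system -> HDFS
hadoop fs -cp /user/ubuntu/data/report.csv /user/ubuntu/backup/    # HDFS -> HDFS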

Which command do you use to upload from the local file system to HDFS?

You can copy (upload) a file from the local filesystem to HDFS using the fs put command. The specified file or directory is copied from your local filesystem to HDFS. You can copy (download) a file from HDFS to your local filesystem using the fs get command.
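A round-trip sketch with hypothetical paths:

hadoop fs -put notes.txt /user/ubuntu/                        # upload notes.txt from the local filesystem to HDFS
hadoop fs -get /user/ubuntu/notes.txt ./copy-of-notes.txt     # download it back to the local filesystem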


1 Answer

You could try:

hadoop fs -put /path/in/linux /hdfs/path 

or even

hadoop fs -copyFromLocal /path/in/linux /hdfs/path 

By default, both put and copyFromLocal upload directories recursively to HDFS.
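To confirm the whole tree (including subfolders) arrived, you can list the destination recursively; /hdfs/path here is the placeholder from the commands above:

hadoop fs -ls -R /hdfs/path   # recursively list everything under the uploaded folder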

answered Sep 24 '22 by Ashrith