I need to copy a folder from the local file system to HDFS. I could not find any example of moving a folder (including all its subfolders) to HDFS.
$ hadoop fs -copyFromLocal /home/ubuntu/Source-Folder-To-Copy HDFS-URI
Step 1: Make a directory in HDFS where you want to copy this folder, as shown below.
Step 2: Use the copyFromLocal command, as shown below, to copy it to the HDFS /Hadoop_File directory.
Step 3: Check whether the folder was copied successfully by listing that HDFS directory, as shown below.
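A minimal sketch of the three steps, assuming the local source folder /home/ubuntu/Source-Folder-To-Copy from the question and /Hadoop_File as the HDFS target directory:

$ hadoop fs -mkdir /Hadoop_File
$ hadoop fs -copyFromLocal /home/ubuntu/Source-Folder-To-Copy /Hadoop_File
$ hadoop fs -ls /Hadoop_File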
From the hadoop shell command usage:

put
Usage: hadoop fs -put <localsrc> ... <dst>
Copy single src, or multiple srcs from local file system to the destination filesystem.
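Since put accepts multiple sources, several local files or folders can be uploaded in one call; the paths here are placeholders:

$ hadoop fs -put dir1 dir2 notes.txt /user/ubuntu/dest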
cp: Copy files from one directory to another within HDFS, similar to the Unix cp command.
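Note that cp operates entirely inside HDFS, so it does not help with uploading from the local filesystem; a small example with hypothetical paths:

$ hadoop fs -cp /Hadoop_File/data.txt /backup/data.txt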
You can copy (upload) a file from the local filesystem to a specific HDFS path using the fs put command. The specified file or directory is copied from your local filesystem to HDFS. You can copy (download) a file from a specific HDFS path to your local filesystem using the fs get command.
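A quick round trip showing both directions, with placeholder paths:

$ hadoop fs -put /home/ubuntu/report.txt /user/ubuntu/report.txt
$ hadoop fs -get /user/ubuntu/report.txt /home/ubuntu/report-copy.txt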
You could try:
hadoop fs -put /path/in/linux /hdfs/path
or even
hadoop fs -copyFromLocal /path/in/linux /hdfs/path
By default, both put and copyFromLocal upload directories recursively to HDFS.
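To confirm that the whole tree arrived, a recursive listing works; the path is the placeholder from the commands above:

$ hadoop fs -ls -R /hdfs/path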